
Cyber Security Targets

Mar 24, 2014 // by admin // Blog

RFG Perspective: While the total cost of the cybersecurity breach at Target will not be known for quite a while, a reasonable estimate is that it could easily cost the company more than $500 million. The price tag includes fines from credit card companies, other fines and lawsuits for non-compliance, services such as free credit report monitoring for the 70 to 110 million customers affected, and discounts required to keep customers coming in the door. These costs far exceed the IT costs associated with better cybersecurity prevention. Target is not alone; it is just the latest in a long line of breaches that have taken major tolls on the attacked organizations. Business and IT executives need to recognize that attackers and hackers will constantly change their multi-pronged, sophisticated attack strategies as they attempt to stay ahead of the protections installed in enterprises. IT executives need to be constantly aware of risk exposures and how they are changing, and continue to invest in measured, integrated cybersecurity solutions to close the gaps.

The Target cyber breach represents a new twist on the long-standing cybersecurity challenge. Unlike most other attacks, which came through direct probes into the corporate network or through socially engineered emails to employees, spear phishing, or multi-vectored malware aimed at IT software, the Target incident was an Operations Technology (OT) play. One reason for this may be that the vendor patch rate has improved and the success rate of zero-day exploits is dropping. Of course, it could also be that the misguided actors were clever enough to try a new attack vector.

IT vs OT

Most IT executives and staff give little thought to OT software, usually referred to as SCADA (supervisory control and data acquisition) software. These are industrial control systems that monitor and control things such as air conditioning, civil defense systems, heating, manufacturing lines, power generation, power usage, transmission lines, and water treatment. IT (outside of the utilities industry) tends to treat these systems and the associated software as outside of their purview. This is no longer true. Cyber attackers are constantly upping the ante and now they have begun going after OT software in addition to traditional attack vectors. IT executives and security personnel need to become actively engaged in ensuring the organization is protected against these types of threats.

Incident Attack Types

According to the IBM X-Force Threat Intelligence Quarterly 1Q2014, the top three disclosed attack types in 2013 were distributed denial of service (DDoS), SQL injection, and malware. These three vectors accounted for 43 percent of 8,330 vulnerability disclosures, while another 46 percent of attack types remained undisclosed. (See the chart below, from the IBM report.) The report also points out that Java vulnerabilities continue to rise year-over-year, tripling in the last year alone. Fully half of the exploited application vulnerabilities were Java based, with Adobe Reader and Internet browsers accounting for 22 and 13 percent respectively. Interestingly, mobile devices excluding laptops have yet to become a major attack point.

[Chart: most common attack types – IBM X-Force Threat Intelligence Quarterly 1Q2014]

Currency

Another common pressure point on IT organizations is keeping current with all the security patches issued by software providers. The good news is that vendors and IT organizations are doing a better job applying patches. The overall unpatched publicly-disclosed vulnerability rate dropped from 41 percent in 2012 to 26 percent in 2013. This is great progress but much remains to be done, especially by enterprise IT. The number of patches to be applied on an ongoing basis can be overwhelming and many IT organizations cannot keep up, especially with quick fixes. Thus, zero-day exploits remain major threats that IT needs to mitigate.

Playing Defense

The challenge for IT CISOs and security staff increases every year as the number and types of actors attempting to gain access to IT systems continues to grow as do the types of attacks. Therefore, enterprises must reduce their risk exposure by using monitoring and blocking software that can rapidly detect problems almost as they occur and shut off attacks immediately before the exposure becomes too large. Additionally, staff must fine-tune access controls and patch known vulnerabilities quickly so as to (virtually) eliminate the ability for criminals to exploit holes in infrastructures. Security executives and staff should work collaboratively with others in their field and share information about attacks, defenses, meaningful metrics, and trends. IT executives should ensure security personnel are continually trained and aware of the latest trends and are implementing the appropriate defenses as rapidly as possible. As people are one of the weakest links in the security chain, IT executives should also ensure all employees are aware of company privacy and security policies and procedures and are judiciously following them.

RFG POV: IT executives and cyber security staff remain behind the curve in preventing, discovering, and containing cyber security attacks, data breaches, and exfiltration. There are some low-hanging initiatives IT can execute to close some of the major vulnerabilities, such as blocking troublesome IP addresses at the perimeter outside the firewall and employing enhanced software monitoring tools that can spot suspect software and alert security. Additionally, staff can improve password requirements, password change frequency, two-factor authentication, inclusion of OT software, and rapid deactivation of access (cyber and physical) for terminated employees. Encryption of data at rest and in transit should also be evaluated. However, IT is not the only one on the line for corporate security – the board of directors and corporate executives share the fiduciary burden for protecting company assets. IT executives should get boards and corporate executives to understand the challenges, establish the acceptable risk parameters, and play an ongoing role in security governance. IT security executives should work with appropriate parties to collect, analyze, and share incident data so that defenses and detection can be enhanced. IT executives should also recognize that cyber security is not just about technology – the weakest links are the people and processes. These gaps should be aggressively pursued and the problems regularly communicated across the organization. The investment in these corrective actions will be far less than the cost of fixing the problem once the damage is done.
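
To make the perimeter-blocking idea concrete, here is a minimal sketch of scanning a traffic log for sources on a blocklist. It is illustrative only, assuming a hypothetical log format and made-up blocklist networks, not a description of any particular firewall or product.

```python
# Illustrative sketch: count hits from known-bad networks in a traffic log.
# The blocklist entries and log format below are hypothetical.
import ipaddress
from collections import Counter

BLOCKLIST = [ipaddress.ip_network(n) for n in ("203.0.113.0/24", "198.51.100.0/24")]

def is_blocked(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKLIST)

def summarize(log_lines):
    """Tally requests per blocklisted source IP; assumes the source IP is the first field."""
    hits = Counter()
    for line in log_lines:
        src = line.split()[0]
        if is_blocked(src):
            hits[src] += 1
    return hits

sample = ["203.0.113.7 GET /login", "192.0.2.1 GET /", "203.0.113.7 POST /login"]
print(summarize(sample))  # Counter({'203.0.113.7': 2})
```

In practice the blocklist would feed the perimeter device itself; a script like this is only useful for spotting candidates and verifying that blocks are working.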

Have You Been Robbed on the Last Mile of Sales?

Feb 16, 2014 // by admin // Blog

It is a fair question, whether you are the seller or the customer. OK, so what is the last mile of sales? I didn’t find an official definition, so I’m borrowing the concept from “the last mile of finance” between the balance sheet and the 10-K, and the “last mile of telecommunications” that is the copper wire from the common carrier’s substation to your home or business. Let’s call the last mile of sales:

that part of the sales funnel in which prospects are ready to become customers, or are already customers, ready for up-selling and cross-selling.

 

Survey Participants were robbed on the Last Mile of Sales!

This Tuesday morning, I looked at our “Poor Data Quality – Negative Business Outcomes” survey results and I noticed a surprising agreement among participants in one sales-related area. 126 respondents, or over 90% of those responding to our question about poor data quality compromising up-selling and cross-selling, indicated they had such a problem. The graph following gives you a sense of how large a percentage of respondents had lost sales opportunities.

[Chart: percentage of respondents robbed on the last mile of sales]

This is a troubling statistic. Organizations spend huge sums on marketing programs designed to attract prospects and nurture them to become customers. Beyond direct monetary investment, ensuring a successful trip down the sales funnel takes time, effort, and ability. From the perspective of the seller, failing to sell more products and services to an existing (presumably happy) client is like being robbed on the last mile of sales. Your organization has already succeeded in making a first sale. Subsequent selling should be easier, not harder. From the perspective of the buyer, losing confidence in your chosen vendor because they fail to know you and your preferences, confuse you with similarly named customers, or display inept record-keeping about their last contact with you, robs you of a relationship you had invested time and money in developing. Perhaps now your go-to vendor becomes your former vendor, and you must spend time seeking an alternate source. Once confidence has been shaken, it is difficult to rebuild.

What did the survey say?

How is it possible that more than 90% of our respondents to this question lost an opportunity to up-sell or cross-sell? The next chart tells the story. It is poor data quality, plain and simple. You can read the results yourself. As a sales prospect for a lead generation service, I had a recent experience with at least one of the top four poor data quality problems.

[Chart: top poor data quality problems compromising up-selling and cross-selling]

Oops, the status wasn’t updated after our last call

In the closing months of 2013, I was solicited by a lead generation firm. I asked them to contact me in the first quarter of 2014. Ten days into 2014, they called again. OK, perhaps a bit early in the quarter, but they are eager for my business. With no immediate need, I asked them to call me again in Q3-2014 to see how things were evolving. So, I was surprised when I received another call from that firm, yesterday. Had we traveled through a time-warp? Was it now mid-summer? A look out the window at the snowstorm in progress suggested it was still February 2014. The caller was the same person as last time, and began an identical spiel. I interrupted and mentioned we had only spoken a week earlier. The caller appeared to remember and agree, indicating that there was no status update about the previous call. Was this sloppy ball-handling by sales, an IT technology issue, an ill-timed database restore? Was this a 1:1,000,000 chance or an everyday occurrence? The answer to all of those questions is “I have no idea, but I don’t want to trust these folks with managing my lead generation campaign”. If they can’t handle their own sales process, how are they going to help me with mine? Whatever the cause of the gaffe, they robbed themselves of a prospect, and me of any confidence I might have had in them.
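
A simple data hygiene check would have caught this before the call went out. The sketch below audits call records for exactly the two failures in my story; the schema and field names are invented for illustration.

```python
# Hypothetical sketch: audit CRM call records for (a) a missing status update
# on the last call and (b) calling before the prospect's requested follow-up.
from datetime import date

records = [  # field names are invented for illustration
    {"prospect": "Acme Co", "last_call": date(2014, 2, 10),
     "follow_up_requested": date(2014, 7, 1), "status_updated": False},
]

def audit(records, today):
    for r in records:
        if not r["status_updated"]:
            yield (r["prospect"], "no status recorded for last call")
        if today < r["follow_up_requested"]:
            yield (r["prospect"], "due for contact only on "
                   + r["follow_up_requested"].isoformat())

for prospect, problem in audit(records, today=date(2014, 2, 15)):
    print(prospect, "-", problem)
# Acme Co - no status recorded for last call
# Acme Co - due for contact only on 2014-07-01
```

Run nightly against the call queue, a check like this would flag prospects who should not be dialed yet.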

The Bottom Line

Being robbed on the last mile of sales by poor data quality is unnecessary, but all too common. Have you recently been robbed on the last mile of sales? Are you a seller, or a disappointed prospect or customer? Cal Braunstein of The Robert Frances Group and I would like to hear from you. Please do contact me to set up an appointment for a conversation. Whether you have already participated in our survey, are a member of the InfoGov community, or simply have an enlightening experience about how poor data quality caused you to have a negative business outcome, reach out and let us know.

Published by permission of Stuart Selip, Principal Consulting LLC

Predictions: Tech Trends – part 1 – 2014

Jan 20, 2014 // by admin // Blog

RFG Perspective: The global economic headwinds in 2014, which constrain IT budgets, will force IT executives to question certain basic assumptions and reexamine current and target technology solutions. There are new waves of next-generation technologies emerging and maturing that challenge the status quo and deserve IT executive attention. These technologies will improve business outcomes as well as spark innovation and drive down the cost of IT services and solutions. IT executives will have to work with business executives to fund the next-generation technologies or find self-funding approaches to implementing them. IT executives will also have to provide the leadership needed for properly selecting and implementing cloud solutions, or control will be assumed by business executives who usually lack the appropriate skills for tackling outsourced IT solutions.

As mentioned in the RFG blog “IT and the Global Economy – 2014” the global economic environment may not be as strong as expected, thereby keeping IT budgets contained or shrinking. Therefore, IT executives will need to invest in next-generation technology to contain costs, minimize risks, improve resource utilization, and deliver the desired business outcomes. Below are a few key areas that RFG believes will be the major technology initiatives that will get the most attention.

Tech-driven Business Transformation

[Chart: Tech-driven Business Transformation – Source: RFG]

Analytics – In 2014, look for analytics service and solution providers to boost usability of their products to encompass the average non-technical knowledge worker by moving closer to a “Google-like” search and inquiry experience in order to broaden opportunities and increase market share.

Big Data – Big Data integration services and solutions will grab the spotlight this year as organizations continue to ratchet up the volume, variety and velocity of data while seeking increased visibility, veracity and insight from their Big Data sources.

Cloud – Infrastructure as a Service (IaaS) will continue to dominate as a cloud solution over Platform as a Service (PaaS), although the latter is expected to gain momentum and market share. Nonetheless, Software as a Service (SaaS) will remain the cloud revenue leader with Salesforce.com the dominant player. Amazon Web Services will retain its overall leadership of IaaS/PaaS providers with Google, IBM, and Microsoft Azure holding onto the next set of slots. Rackspace and Oracle have a struggle ahead to gain market share, even as OpenStack (an open cloud architecture) gains momentum.

Cloud Service Providers (CSPs) – CSPs will face stiffer competition and pricing pressures as larger players acquire or build new capabilities. New, innovative open-source-based solutions also enter the new year with momentum, as large, influential organizations look to build and share their own private and public cloud standards and APIs to lower infrastructure costs.

Consolidation – Data center consolidation will continue as users move applications and services to the cloud and to standardized internal platforms that are intended to become cloud-like. Advancements in cloud offerings, along with a diminished concern for security (more a false hope than reality), will lead more small and mid-sized businesses (SMBs) to shift processing to the cloud and operate fewer internal data center sites. Large enterprises will look to utilize clouds and colocation sites for development/test environments and for handling spikes in capacity, rather than open or grow in-house sites.

Containerization – Containerization (or modularization) is gaining acceptance by many leading-edge companies, like Google and Microsoft, but overall adoption is slow, as IT executives have yet to figure out how to deal with the technology. It is worth noting that the power usage effectiveness (PUE) of these solutions is excellent and has been known to be as low as 1.05 (whereas the average remains around 1.90).
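
For reference, power usage effectiveness is defined as total facility power divided by the power delivered to the IT equipment itself, so the figures above translate directly into overhead (the 1,000 kW IT load below is an illustrative assumption):

```python
# PUE = total facility power / IT equipment power (standard definition).
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# At the cited figures, a containerized site drawing 1,050 kW to run
# 1,000 kW of IT gear has PUE 1.05; a site at the 1.90 average needs ~1,900 kW.
print(pue(1050, 1000))  # 1.05
print(pue(1900, 1000))  # 1.9
```

In other words, a containerized facility spends about 5 percent of its power on overhead where the average facility spends about 90 percent extra.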

Data center transformation – In order to achieve the levels of operational efficiency required, IT executives will have to increase their commitment to data center transformation. The productivity improvements will be achieved through the shift from standalone vertical stack management to horizontal layer management, relationship management, and the use of cloud technologies. One of the biggest effects of this shift is an actual reduction in operations headcount and a reorientation of skills and talents to the new processes. IT executives should expect the transformation to take a minimum of three years. However, IT operations executives should not expect smooth sailing, as development shops will push back to prevent loss of control of their application environments.

3-D printing – 2014 will see the beginning of 3-D printing taking hold. Over time the use of 3-D printing will revolutionize the way companies produce materials and provide support services. Leading-edge companies will be the first to apply the technology this year and thereby gain a competitive advantage.

Energy efficiency/sustainability – While this is not news in 2014, IT executives should be making it a part of other initiatives and a procurement requirement. RFG studies find that energy savings are just the tip of the iceberg (about 10 percent) of the savings that can be achieved by taking advantage of newer technologies. RFG studies also show that in many cases the energy savings from retiring hardware kept more than 40 months can pay for new, better-utilized equipment. Or, as an Intel study found, servers more than four years old accounted for four percent of the relative performance capacity yet consumed 60 percent of the power.

Hyperscale computing – RFG views hyperscale computing as the next wave of computing that will replace the low end of the traditional x86 server market. The space is still in its infancy, with the primary players being Advanced Micro Devices' (AMD's) SeaMicro solutions and Hewlett-Packard's (HP's) Moonshot server line. While penetration will be low in 2014, the value proposition of hyperscale solutions should become evident.

Integrated systems – Integrated systems is a poorly defined computing category that encompasses converged architectures, expert systems, and partially integrated systems as well as expert integrated systems. The major players in this space are Cisco, EMC, Dell, HP, IBM, and Oracle. While these systems have been on the market for more than a year now, revenues are still limited (depending upon whom one talks to, revenues may now exceed $1 billion globally) and adoption is moving slowly. Truly integrated systems do result in productivity, time, and cost savings, and IT executives should be piloting them in 2014 to determine the role and value they can play in corporate data centers.

Internet of things – More and more sensors are being employed and embedded in appliances and other products, which will automate and improve life in IT and in the physical world. From a data center infrastructure management (DCIM) perspective, these sensors will enable IT operations staff to better monitor and manage system capacity and utilization. 2014 will see further advancements and inroads made in this area.

Linux/open source – The trend toward Linux and open source technologies continues with both picking up market share as IT shops find the costs are lower and they no longer need to be dependent upon vendor-provided support. Linux and other open technologies are now accepted because they provide agility, choice, and interoperability. According to a recent survey, a majority of users are now running Linux in their server environments, with more than 40 percent using Linux as either their primary server operating system or as one of their top server platforms. (Microsoft still has the advantage in the x86 platform space and will for some time to come.) OpenStack and the KVM hypervisor will continue to acquire supporting vendors and solutions as players look for solutions that do not lock them into proprietary offerings with limited ways forward. A Red Hat survey of 200 U.S. enterprise decision makers found that internal development of private cloud platforms has left organizations with numerous challenges such as application management, IT management, and resource management. To address these issues, organizations are moving or planning a move to OpenStack for private cloud initiatives, respondents claimed. Additionally, a recent OpenStack user survey indicated that 62 percent of OpenStack deployments use KVM as the hypervisor of choice.

Outsourcing – IT executives will be looking for more ways to improve outsourcing transparency and cost control in 2014. Outsourcers will have to step up to the SLA challenge (mentioned in the People and Process Trends 2014 blog) as well as provide better visibility into change management, incident management, projects, and project management. Correspondingly, with better visibility there will be a shift away from fixed-price engagements to ones with fixed and variable funding pools. Additionally, IT executives will be pushing for more contract flexibility, including payment terms. Application hosting displaced application development in 2013 as the most frequently outsourced function, and 2014 will see the trend continue. The outsourcing of ecommerce operations and disaster recovery will be seen as having strong value propositions when compared to performing the work in-house. However, one cannot assume outsourcing is less expensive than handling the tasks internally.

Software defined x – Software defined networks, storage, data centers, etc. are all the latest hype. The trouble with all new technologies of this type is that the initial hype will not match reality. The new software defined market is quite immature and all the needed functionality will not be out in the early releases. Therefore, one can expect 2014 to be a year of disappointments for software defined solutions. However, over the next three to five years it will mature and start to become a usable reality.

Storage – Flash SSD et al – Storage is once again going through revolutionary changes. Flash, solid state drives (SSD), thin provisioning, tiering, and virtualization are advancing at a rapid pace as are the densities and power consumption curves. Tier one to tier four storage has been expanded to a number of different tier zero options – from storage inside the computer to PCIe cards to all flash solutions. 2014 will see more of the same with adoption of the newer technologies gaining speed. Most data centers are heavily loaded with hard disk drives (HDDs), a good number of which are short stroked. IT executives need to experiment with the myriad of storage choices and understand the different rationales for each. RFG expects the tighter integration of storage and servers to begin to take hold in a number of organizations as executives find the closer placement of the two will improve performance at a reasonable cost point.

RFG POV: 2014 will likely be a less daunting year for IT executives, but keeping pace with technology advances will have to be part of any IT strategy if executives hope to achieve their goals for the year and keep their companies competitive. This will require IT to understand the rate of technology change and adopt a data center transformation plan that incorporates the new technologies at the appropriate pace. Additionally, IT executives will need to invest annually in new technologies to help contain costs, minimize risks, and improve resource utilization. IT executives should consider a turnover plan that upgrades (and transforms) a third of the data center each year. IT executives should collaborate with business and financial executives so that IT budgets and plans are integrated with the business and remain so throughout the year.

Predictions: People & Process Trends – 2014

Jan 20, 2014 // by admin // Blog

RFG Perspective: The global economic headwinds in 2014, which constrain IT budgets, will force business and IT executives to more closely examine the people and process issues for productivity improvements. Externally IT executives will have to work with non-IT teams to improve and restructure processes to meet the new mobile/social environments that demand more collaborative and interactive real-time information. Simultaneously, IT executives will have to address the data quality and service level concerns that impact business outcomes, productivity and revenues so that there is more confidence in IT. Internally IT executives will need to increase their focus on automation, operations simplicity, and security so that IT can deliver more (again) at lower cost while better protecting the organization from cybercrimes.

As mentioned in the RFG blog “IT and the Global Economy – 2014” the global economic environment may not be as strong as expected, thereby keeping IT budgets contained or shrinking. Therefore, IT executives will need to invest in process improvements to help contain costs, enhance compliance, minimize risks, and improve resource utilization. Below are a few key areas that RFG believes will be the major people and process improvement initiatives that will get the most attention.

Automation/simplicity – Productivity in IT operations is a requirement for data center transformation. To achieve this IT executives will be pushing vendors to deliver more automation tools and easier to use products and services. Over the past decade some IT departments have been able to improve productivity by 10 times but many lag behind. In support of this, staff must switch from a vertical and highly technical model to a horizontal one in which they will manage services layers and relationships. New learning management techniques and systems will be needed to deliver content that can be grasped intuitively. Furthermore, the demand for increased IT services without commensurate budget increases will force IT executives to pursue productivity solutions to satisfy the business side of the house. Thus, automation software, virtualization techniques, and integrated solutions that simplify operations will be attractive initiatives for many IT executives.

Business Process Management (BPM) – BPM will gain more traction as companies continue to slice costs and demand more productivity from staff. Executives will look for BPM solutions that will automate redundant processes, enable them to get to the data they require, and/or allow them to respond to rapid-fire business changes within (and external to) their organizations. In healthcare in particular this will become a major thrust, as the industry needs to move toward a “pay for outcomes” and away from a “pay for service” mentality.

Chargebacks – The movement to cloud computing is creating an environment that is conducive to implementation of chargebacks. The financial losers in this game will continue to resist but the momentum is turning against them. RFG expects more IT executives to be able to implement financially-meaningful chargebacks that enable business executives to better understand what the funds pay for and therefore better allocate IT resources, thereby optimizing expenditures. However, while chargebacks are gaining momentum across all industries, there is still a long way to go, especially for in-house clouds, systems and solutions.
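
A financially meaningful chargeback can start as simply as allocating a shared cost pool in proportion to metered usage. The sketch below is deliberately simplified, with invented numbers; real models add fixed costs, tiered rates, and reserved capacity.

```python
# Illustrative usage-based chargeback: split a shared monthly infrastructure
# cost across business units in proportion to metered consumption.
monthly_pool = 100_000.0  # shared infrastructure cost, USD (hypothetical)

usage_vm_hours = {  # metered consumption per business unit (hypothetical)
    "Sales": 4_000,
    "Finance": 1_000,
    "Manufacturing": 5_000,
}

total = sum(usage_vm_hours.values())
for bu, used in usage_vm_hours.items():
    print(f"{bu}: ${monthly_pool * used / total:,.2f}")
# Sales: $40,000.00 / Finance: $10,000.00 / Manufacturing: $50,000.00
```

Even a model this crude lets business executives see what the funds pay for, which is the point of the exercise.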

Compliance – Thousands of new regulations took effect on January 1, as happens every year, making compliance even tougher. In 2014 the Affordable Care Act (aka Obamacare) kicked in for some companies but not others; compounding this, the U.S. President and his Health and Human Services (HHS) department keep issuing modifications to the law, which impact compliance and compliance reporting. IT executives will be hard pressed to keep up with compliance requirements globally and to improve users’ support for compliance.

Data quality – A recent study by RFG and Principal Consulting on the negative business outcomes of poor data quality finds that a majority of users consider data quality suspect. Most respondents believed inaccurate, unreliable, ambiguously defined, and disorganized data were the leading problems to be corrected. Some users will partially address this in 2014 by looking at data confidence levels in association with the type and use of the data. IT must fix this problem if it is to regain trust. But it is not just an IT problem, as it is costing companies dearly, in some cases more than 10 percent of revenues. Some IT executives will begin to capture the metrics required to build a business case to fix this, while others will implement data quality solutions aimed at select problems that have been determined to be troublesome.

Operations efficiency – This will be an overriding theme for many IT operations units. As has been the case over the years the factors driving improvement will be automation, standardization, and consolidation along with virtualization. However, for this to become mainstream, IT executives will need to know and monitor the key data center metrics, which for many will remain a challenge despite all the tools on the market. Look for minor advances in usage but major double-digit gains for those addressing operations efficiency.

Procurement – With the requirement for agility and the move towards cloud computing, more attention will be paid to the procurement process and supplier relationship management in 2014. Business and IT executives that emphasize a focus on these areas can reduce acquisition costs by double digits and improve flexibility and outcomes.

Security – The use of big data analytics and more collaboration will help improve real-time analysis but security issues will still be evident in 2014. RFG expects the fallout from the Target and probable Obamacare breaches will fuel the fears of identity theft exposures and impair ecommerce growth. Furthermore, electronic health and medical records in the cloud will require considerable security protections to minimize medical ID theft and payment of HIPAA and other penalties by SaaS and other providers. Not all providers will succeed and major breaches will occur.

Staffing – IT executives will do limited hiring again this year and will rely more on cloud services, consulting, and outsourcing services. There will be some shifts in suppliers and resource country-pool usage as advanced cloud offerings, geopolitical changes, and economic factors drive IT executives to select alternative solutions.

Standardization – More and more IT executives recognize the need for standardization, but advancement will require a continued executive push and involvement. Because this will become political, most new initiatives will be the result of the desire for cloud computing rather than internal leadership.

SLAs – Most IT executives and cloud providers have yet to provide the service levels businesses are demanding. More and better SLAs, especially for cloud platforms, are required. IT executives should push providers (and themselves) for SLAs covering availability, accountability, compliance, performance, resiliency, and security. Companies that address these issues will be the winners in 2014.
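
When negotiating availability SLAs, it helps to translate the percentages into concrete allowed downtime; the arithmetic is standard:

```python
# Convert an availability percentage into allowed downtime per year.
HOURS_PER_YEAR = 24 * 365

def downtime_hours(availability_pct: float) -> float:
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for a in (99.0, 99.9, 99.99):
    print(f"{a}% availability allows {downtime_hours(a):.2f} hours down/year")
# 99.0% -> 87.60 h, 99.9% -> 8.76 h, 99.99% -> 0.88 h
```

Each additional "nine" is a tenfold cut in allowed downtime, which is why providers price those tiers so differently.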

Watson – The IBM Watson cognitive system is still at the beginning of the acceptance curve, but IBM is opening up Watson for developers to create their own applications. 2014 might be a breakout year, starting a new wave of cognitive systems that will transform how people and organizations think, act, and operate.

RFG POV: 2014 will likely be a less daunting year for IT executives but people and process issues will have to be addressed if IT executives hope to achieve their goals for the year. This will require IT to integrate itself with the business and work collaboratively to enhance operations and innovate new, simpler approaches to doing business. Additionally, IT executives will need to invest in process improvements to help contain costs, enhance compliance, minimize risks, and improve resource utilization. IT executives should collaborate with business and financial executives so that IT budgets and plans are integrated with the business and remain so throughout the year.

IT and the Global Economy – 2014

Dec 23, 2013 // by admin // Blog

RFG Perspective: There will be a number of global economic headwinds in 2014 that will mean slow or no growth around the world. The U.S. could creep up to three percent growth but the Affordable Care Act (Obamacare) implementation has a high probability of reducing growth to the 2013 level or less. This uncertainty will result in IT budgets remaining constrained and making it difficult for IT executives to keep current in technology, meet new business demands, and develop the skills necessary to satisfy corporate requirements.

Third quarter U.S. GDP gives the illusion that the U.S. economy is strengthening but that is hardly the case. The gains were in inventory buildups. Remove that and the economy of the United States mirrors that of many other countries. Europe remains weak and bounces in and out of recession while many of the so-called emerging markets are no longer bounding ahead. The BRIC nations (Brazil, Russia, India, China), whose growth had offset the weakness in the developed nations, are now underperforming. Growth in Brazil, India, and Russia has dropped significantly from the peak while China’s merely slipped into more normal numbers. Now that the U.S. Federal Reserve has begun its taper, these nations could tumble even more. This does not bode well for revenue growth, which, in turn, means tighter IT budgets.

In addition to the Federal Reserve’s actions overhanging the U.S. and global markets, Obamacare may add to the negative effect. The Affordable Care Act (aka Obamacare) is not that affordable and it seems the majority of individuals (and potentially corporations) are finding monthly payments are significantly higher, as are deductibles. This could slow the general economy even more if consumers and corporations are forced to hold back spending to cover basic healthcare costs.

The Bellwethers Struggle

There are three IT bellwethers for growth that we can look at to see how the world economy is faring and how it is already impacting IT acquisitions. Some may say these companies – Cisco Systems Inc., Hewlett-Packard Co. (HP), and IBM Corp. – are no longer applicable in the new world of cloud computing, but that is a false premise. These three firms are all heavily into the cloud and are growing rapidly in cloud/Internet-related areas.

Cisco reported single digit revenue growth for 2013 year-over-year with revenues in the Asia Pacific area shrinking by three percent. While that is not bad, CEO John Chambers warned that revenues would decline eight to 10 percent in this quarter – its biggest drop in 13 years. One reason is that it is struggling in the top five emerging markets where revenues declined 21 percent. Brazil was down 25 percent; China, India and Mexico dropped 18 percent; and Russia slid 30 percent.

HP’s fiscal year 2013 showed similar revenue results – down by single digits. It had lower revenues in all regions, and printing supplies slipped four percent year-over-year. Printing supplies has been one of HP’s internal leading economic indicators, so this news is not good.

IBM’s third quarter revenues came in four percent under the previous year’s quarter, with all geographies down slightly or flat. But its growth markets revenues fell by nine percent and the BRIC revenues declined by 15 percent. There is a pattern here.

The collapse of revenues in the emerging markets and BRIC nations is less a story about the bellwethers than about the countries’ declining economies. These countries and the U.S. were the engines of growth. Not any longer.

 RFG POV: 2014 has the appearance of being a less daunting year for IT executives than the past few years but economic, geopolitical and governmental disruptions could change all that almost overnight. Businesses may be able to avoid the global minefields that are lurking everywhere but the risk exposure is there. Therefore, it is highly likely that most CEOs and CFOs will want to constrain IT spending – i.e., flat, down or up slightly. Moreover, most budgets are reflections of the prior year’s budget with modifications to address the changing business requirements and economic environment. Therefore, IT executives can expect to have limited options as they work to meet new business demands, keep up with technology, and develop the skills needed to satisfy corporate requirements. It is time to innovate, do more with less again, and/or find self-funding solutions. Additionally, IT executives will need to invest in process improvements to help contain costs, enhance compliance, minimize risks, and improve resource utilization. IT executives should work closely with business and financial executives so that IT budgets and plans are integrated with the business and remain so throughout the year.

Major Advances in BPM and ERP

Dec 23, 2013 // by admin // Blog

RFG Perspective: Business executives in small- and medium-sized businesses (SMBs) as well as those in rapidly-changing large organizations can be at a disadvantage compared to their counterparts in relatively staid organizations. They must juggle a myriad of challenges, oftentimes without automated processes, usually because traditional ERP solutions either cannot be modified easily or the price point is prohibitive. These executives need business process management (BPM) and/or enterprise resource planning (ERP) solutions that will automate redundant processes, enable them to get to the data they require, and/or allow them to respond to rapid-fire business changes within (and external to) their organizations.

At the 2013 JRocket Marketing Analyst Road Show in Boston, Massachusetts, three innovative, disruptive technology vendors made announcements that can help business executives optimize their business processes. These game-changing vendors are:

  • Apparancy, the sister company of Corefino and powered by Corefino’s 500-plus pre-built cloud-based Software-as-a-Service (SaaS) process framework, made its debut. Apparancy BPM solutions will initially target healthcare-related challenges faced by both enterprises and providers, and other areas in desperate need of quantum leaps in business process improvements.
  • SYSPRO is transforming the manufacturing/distribution sector through its unprecedented, rapidly-deployed, and specialized solutions for industry micro-verticals, both on-premise and in the cloud.
  • UNIT4 is a global ERP solution provider that is expanding its offerings to Businesses Living IN Change (BLINC)™ – businesses that are changing rapidly due to mergers and acquisitions, global expansion, compliance, reorganization, etc.

Apparancy

Today, executives must transform themselves into business process visionaries to guide their organizations into a sustainable and thriving future. Executives across the enterprise, and in particular on the administrative side of healthcare, spend inordinate amounts of time on redundant and repetitive processes, distracting them from the real work at hand, which costs their organizations millions of dollars annually.

Market newcomer Apparancy delivers BPM expertise through an automated, compliance-centric, and holistic business process workflow framework. Apparancy’s previously introduced cloud-based sister company, Corefino, has already proven that its 500-plus process framework can save organizations 25 to 50 percent or more compared to the costs attached to their current workflow frameworks.

Apparancy customers get pre-built workflows to solve specific issues, such as compliance with Affordable Care Act (ACA) mandates, in a platform that sits on top of existing data systems and that can then be continuously (and easily) updated and modified. The cloud-based SaaS model is proven (based on the five-year experience of sister company Corefino) to support legal compliance while delivering substantial, measurable ROI.

Apparancy’s workflow platform minimizes and simplifies state-, federal-, and industry-mandated compliance. The framework vets data and marries systems of record with systems of engagement to make business processes accurate and auditable. In essence, the Apparancy solution enables the any-device, anywhere, anytime paradigm to be applied to pre-configured business process workflows – an industry first.

Executives must be able to confidently manage, sustain, and grow their organizations well into the future – as well as remain compliant. Apparancy can provide these organizations with the information they need anytime, anywhere, as well as detailed-as-necessary visibility into internal workflows, without having to increase talent acquisition. Enterprise human resources (HR) executives and healthcare providers dealing with new legislation are key areas under extreme stress for which Apparancy will provide much-needed support.

SYSPRO

In a super-sized world, mid-market business executives have learned that “bigger is not always better.” The answer to complex business problems is not a larger, more complex ERP solution. Moreover, one size does not fit all. This is especially true in manufacturing and distribution, in which consolidation, outsourcing, and off-shoring have become de rigueur. In addition, regulations change continuously, and large retail organizations often define the standards that SMB manufacturers/distributors must follow. This has become increasingly challenging, driving many out of business.

For business executives to respond to change with agility as well as grow their businesses, they require an ERP vendor with solutions that go beyond simply targeting the manufacturing and distribution verticals. They need a vendor solution that drills down into the business, finance, technology, and regulatory challenges of specific micro-verticals, such as food and beverage, medical devices, electronics, or machinery and equipment.

At the 2013 Analyst Road Show, SYSPRO, a best-of-breed ERP solution for SMB manufacturers/distributors, announced the SYSPRO USA BRAIN BOOST program, part of the U.S. team’s successful “Einstein” market strategy. The four-point initiative continues to deliver on SYSPRO’s 35-year legacy of providing standards-based technology, multi-tiered architectures, and scalability, along with an agile user interface. This enables business executives to continuously and swiftly adapt to market, standards, and compliance fluctuations.

United States manufacturing and distribution sectors have undergone sector-shattering changes. Many have been unable to adapt and have been forced to close. To remain in the game and be continuously viable, it is paramount for manufacturing and business executives to partner with a reliable, customer-focused, and future-directed vendor.

UNIT4

Business executives in fast-changing organizations or those with highly complex financial reporting structures are often at the mercy of rigid two-dimensional systems that do not allow for nimble access to, and manipulation of, financial data. In addition, many of the widely-installed ERP systems are prohibitively expensive to install, maintain, and then continuously modify to allow for this kind of agility.

The promise of post-implementation business flexibility/agility from larger ERP vendors has in many cases not been fulfilled. UNIT4 has found itself in the enviable position of being the replacement product for many high-end high-cost ERP solutions that failed to meet customer needs and expectations and cost customers millions of dollars. UNIT4 is a least-cost ERP/financial solution provider that has successfully shifted its model to the cloud (without a dip in revenues). The entire acquisition and installation cost for UNIT4 software was typically the same as that of a one-year provider license for a competitor ERP solution and that is just the beginning of the savings.

At the December 2013 Analyst Road Show UNIT4 made several announcements, including the launch of two financial performance products in the North American market (Cash Flow and Financial Consolidation) and a new change-supporting in-memory analytics solution. Recently IDC, a global market intelligence firm, conducted a UNIT4-sponsored survey of 167 customers covering ERP purchase, implementation, maintenance, upgrade, and re-implementation. Significant observations from the survey include: “UNIT4 customers spent [an] average of 55 percent less than the general ERP community on supporting business change…UNIT4 customers also reported having to make moderate to substantial system change only 25 percent of the time for mergers and acquisitions…” compared to 64 percent of non-UNIT4 ERP customers.

It behooves business executives to take a closer look at the direct and indirect costs – as well as the business impacts – associated with making changes to their ERP systems. Executives who wish to cost-effectively leverage their ERP systems should consider comparing the total cost of ownership (TCO) and return on investment (ROI) of their existing solution to that of an alternative ERP vendor.

Summary

The December 2013 Analyst Road Show showcased three disruptive technology vendors with three different foci: Apparancy is poised to have a significant, positive, and indispensable impact on the healthcare sector because it will provide healthcare executives (and enterprises conforming to new healthcare legislation) with a SaaS-deployed, streamlined, and cost-conscious solution. SYSPRO continues to be the champion of customized and quickly deployable ERP solutions for SMB manufacturers and distributors. UNIT4 solutions are designed to enable executives to embrace business change – simply, quickly, and cost-effectively.

RFG POV: All disruptive technology vendors herein are primed to enable their customers to not only remain viable and be competitive, but to experience sustained revenue growth. Business executives, whether across the enterprise, in healthcare, SMB manufacturing/distribution, or larger but fast-changing organizations must look beyond solving today’s business and IT problems. They must look to the future and be able to predict as well as respond nimbly and effectively to financial, market, and policy changes – well into the next two decades. Executives who possess business acumen should select long-term trusted vendor partners that will enable them to not only respond agilely to change but to do so faster than their competitors.

Blog written by Ms. Maria DeGiglio, Principal Analyst

Disrupt 2014

Dec 20, 2013 // by admin // Blog

What did I learn (or what did you miss) at Disrupt 2014?

On Monday I attended The Robert Frances Group’s (RFG) Disrupt 2014 conference in New York City. The attendees enjoyed an over-the-top lunch and several hours of interesting and informative presentations. I would probably gain weight just recounting the menu, so instead I’ll review two of the presentations, and hint at a third.
 

1 – IT Disruptive Tech Trends and Directions

With this as the title, Cal Braunstein, CEO and Executive Director of Research for RFG, began by showing us how the pace of disruptive technology is increasing dramatically. According to AIIM, for more than 1/3 of the organizations studied, 90% of their IT spend adds no new value. It was the right start for Disrupt 2014. 

IT budgets are increasing linearly, at 1-5% per year, while the data we produce is increasing exponentially. Disruption isn’t just necessary, it is critical!

What is disruption? RFG defines it as industry leaders responding to the changes in customer demands and global economics by making fundamental changes in their approach to products, services, service delivery, engagement models, and the economic model on which their industry is based. You can read more about it here.

Cal focused on Storage Trends and Directions as a bellwether of disruption.

Intel conducted a study in 2012 that informed part of Cal’s presentation. When looking at a data center and its technology, servers from before 2008 comprised 32% of the hardware but consumed 60% of the data center power budget. Here is the bad news: the old gear only provided 4% of performance capacity. The point is that the technology rate of change is exponential, and any IT executive who keeps IT hardware past 40 months is costing his company money. Keeping current and transforming the data center over a three-year cycle should be viewed as a strategic approach to modernizing a data center and containing costs. The improvement in processing power vs. power consumed is truly disruptive. You could pay for a data center renewal simply by scrapping the old, power-burning gear. RFG has identified the following optimization opportunities, which Cal presented.
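
To see why scrapping old gear can fund a renewal, here is a back-of-envelope version of the argument. The 60%-of-power and 4%-of-capacity ratios are from the study Cal cited; the IT load and electricity price are my own illustrative assumptions.

```python
# Back-of-envelope cost of keeping pre-2008 servers running.
# Ratios from the cited study; load and tariff are illustrative assumptions.
it_load_kw = 1000.0            # assumed total IT power draw
price_per_kwh = 0.10           # assumed electricity price, USD
old_share_of_power = 0.60      # old gear's share of the power budget
old_share_of_capacity = 0.04   # old gear's share of delivered performance

old_power_kw = it_load_kw * old_share_of_power
annual_cost = old_power_kw * 24 * 365 * price_per_kwh
print(f"Old gear draws {old_power_kw:.0f} kW, costing ${annual_cost:,.0f}/year "
      f"for {old_share_of_capacity:.0%} of capacity")
# Old gear draws 600 kW, costing $525,600/year for 4% of capacity
```

On those assumptions, over half a million dollars a year buys almost no useful compute; redirected into modern, better-utilized equipment, that money goes a long way toward funding the refresh.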

[Chart: RFG optimization opportunities]

RFG has “Disrupt 2014” marching orders for IT

  • IT departments must keep up with disruptive technologies
  • Don’t wait for next wave to become mainstream – the time to act is now
  • IT vision and strategy must include waves of change
  • IT needs business and user executives to share the vision, integrate & buy-in
  • Data Center transformation is a must and has a very positive business case
  • IT vendors must show a business case, not just talk about products & services

 2 – The Value of TCO Studies

RFG Principal Analyst Gary McFadden presented on the value of Total Cost of Ownership (TCO) studies to support acquisition decisions. Gary’s Disrupt 2014 thesis was straightforward. According to Gary…

Businesses today are willing to invest provided they can see a decent return and a positive cost value proposition. Our TCO studies address the total cost of acquisition and ownership and the return on investment.

In the RFG world, a TCO is a business-oriented custom research report that compares vendor offerings to those of their competitors. The report shows how the vendor’s solution could be financially superior to traditional approaches, and considers soft-dollar aspects as well as hard costs.

Knowing that, you should realize that all TCO studies are not created equal. Gary presented an overview of key TCO choices. Which one is right for you? Well, that depends on your needs. Your time horizon and buying cycle are chief determinants of which style of TCO would be right for you.

[Chart: key TCO study choices]

 Making Sense of the Data

Gary correctly pointed out that no matter how comprehensive the data gathering phase of a TCO, the accumulated bulk of data is not actionable unless it is expressed in a useful manner. In the charts following, Gary contrasted the raw details with an organized RFG presentation. The left chart holds mysteries, while the right one offers answers.

[Charts: raw TCO data (left) contrasted with the organized RFG presentation (right)]

Remember that all TCOs are not created equal. Make sure that the one you pay for delivers the information you need, and the ones you use in your research follow a robust development methodology and contain the insights you require to enable your decision.

By the way, don’t think RFG delivers vendor-slanted product and service stories. That is not what they are about. Instead, RFG’s TCO reports are targeted at business and technology executives including CIOs, IT VPs/directors, CFOs, CMOs, facilities executives, etc. They relate a business narrative designed to explain the business case for taking certain actions.

3 – Storm Insights “Mystery Presentation”

No, I can’t tell you much. All I can say is that Dr. Adrian Bowles has an exciting and disruptive information services offering that will change the way IT product and services vendors learn about their markets, and the way their markets learn about them. Don’t write off traditional research and advisory services yet, but stay tuned. I’ll report about the Storm Insights offering as the new year unfolds. For now, take my word for it… this is very interesting!

The Bottom Line

Disrupt 2014 delivered great value and a great lunch to the appreciative attendees. RFG extended the hand of friendship and those that took hold learned much and enjoyed the sense of community that appears when like-minded professionals gather to exchange ideas. When RFG asks you to attend one of their meetings, the smart answer is “YES”!

Published by permission of Stuart Selip, Principal Consulting LLC


The Butterfly Effect of Bad Data (Part 2)

Oct 16, 2013 // by admin // Blog

Last time… Bad Data was revealed to be pervasive and costly

In the first part of this two-part post, I wrote about the truly abysmal business outcomes our survey respondents reported in our “Poor Data Quality – Negative Business Outcomes” survey. Read about it here. In writing part 1, I was stunned by the following statistic: 95% of those suffering supply chain issues noted reduced or lost savings that might have been attained from better supply chain integration. The lost savings were large, with 15% reporting missed savings of up to 20%! In this post, I’ll have a look at supply chains, how passing bad data among participants harms them and their stakeholders, and how this can cause a butterfly effect.

Supply Chains spread the social disease of bad data

A supply chain is a community of “consenting” organizations that pass data across and among themselves to facilitate business functions such as sales, purchase ordering, inventory management, order fulfillment, and the like. The intention is that executing these business functions properly will benefit the end consumer, and all the supply chain participants.

In case the Human Social Disease Analogy is not clear…

Human social diseases spread like wildfire among communities of consenting participants. In my college days, the news that someone contracted a social disease was met with chuckles, winks, and a trip to the infirmary for a shot of penicillin. Once, a business analogy to those long-past days might have been learning that the 9-track tape you sent to a business partner had the wrong blocking factor, was encoded in ASCII instead of EBCDIC, or contained yesterday’s transactions instead of today’s. All of those problems were easily fixed. Just like getting a shot of penicillin. Today, we live in a different world. As we learned with AIDS, a social disease can have pervasive and devastating results for individuals and society. With a communicable disease, the inoculant has its “bad data” encoded in DNA. Where supply chains are concerned, the social disease inoculant is likely to be an XML-encoded transaction sent from one supply chain participant to another. In this case, the “bad transaction” information about the customer, product, quantity, price, terms, due date, or other critical information will simply be wrong, causing a ripple of negative consequences across the supply chain. That ripple is the Butterfly Effect.

The BUTTERFLY EFFECT

The basis of the Butterfly Effect is that a small, apparently random input to an interconnected system (like a supply chain) may have a ripple effect, the ultimate impact of which cannot be reasonably anticipated. As the phrase was constructed by Philip Merilees back in 1972…

Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?

According to Martin Christopher, in his 2005 E-Book “Logistics and Supply Chain Management”, the butterfly can and will upset the supply chain.

Today’s supply chains are underpinned by the exchange of information between all the entities and levels that comprise the complete end-to-end [supply chain] network… The so-called “Bullwhip” effect is the manifestation of the way that demand signals can be considerably distorted as a result of multiple steps in the chain. As a result of this distortion, the data that is used as input to planning and forecasting activities can be flawed and hence forecast accuracy is reduced and more costs are incurred. (Emphasis provided)
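
The bullwhip distortion is easy to demonstrate with a toy model. If each tier orders its observed demand plus a fraction of the period-over-period change (trend-chasing), a small wobble in retail demand amplifies at every step up the chain. The ordering rule below is my own deliberately simple illustration, not taken from Christopher’s book.

```python
# Toy bullwhip model: each tier orders observed demand plus a fraction of
# the change since last period, amplifying variability upstream.
def tier_orders(demand, k=0.5):
    """Order = observed demand + k * (change vs. previous period)."""
    orders, prev = [], demand[0]
    for d in demand:
        orders.append(max(0.0, d + k * (d - prev)))
        prev = d
    return orders

retail = [100, 100, 110, 95, 100, 120, 100]  # end-customer demand
print(f"retail: orders swing over a range of {max(retail) - min(retail)} units")
signal = retail
for tier in ("distributor", "wholesaler", "factory"):
    signal = tier_orders(signal)
    print(f"{tier}: orders swing over a range of {max(signal) - min(signal):.1f} units")
# The swing widens at every tier: 25 -> 42.5 -> 73.8 -> 127.5 units.
```

This variance amplification is exactly the distortion Christopher describes, and injecting bad data into the shared demand signal compounds in the same way.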

Supply Chains Have Learned that Bad Data is a Social Disease

Supply chains connect a network of organizations that collaborate to create and distribute a product to its consumers. Specifically, Investopedia defines a supply chain as:

The network created amongst different companies producing, handling and/or distributing a specific product. Specifically, the supply chain encompasses the steps it takes to get a good or service from the supplier to the customer

Managing the supply chain involves exchanges of data among participants. It is easy to see that exchanging bad data would disrupt the chain, adding cost, delay, and risk of ultimate delivery failure to the supply chain mix. With sufficient bad data, the value delivered by a managed supply chain would come at a higher cost and risk. Consider the graphic of supply chain management and the problems our survey respondents found in their experiences with supply chains and bad data.

[Charts: supply chain management overview; survey-reported supply chain data problems]

Bad data means ambiguously defined data, missing data, and inaccurate data with corrupt or plugged values. These issues lead the list of supply chain data problems found by our survey respondents.

Would you be pleased to purchase a new car delivered with parts that do not work, or that break, because suppliers misinterpreted part specifications? Do you remember the 1960s-era Fords whose doors let snow inside because of their poor fit? Let’s not pillory Ford; GM and Chrysler had their own quality meltdowns too. Supply chain-derived quality issues like these kill revenues and harm consumers and brands.

Would you like to drive the automobile that contained safety components ordered with missing and corrupt data? What about that artificial knee replacement you were considering? Suppose the specifications for your medical device implant had been ambiguously defined and then misinterpreted. Ready to go ahead with that surgery? Bad data is a social disease, and it could make you suffer!

Bad Data is an Expensive Supply Chain Social Disease

Bad data is costing supply chain participants big money. As the graphic from our survey indicates, more than 20% of respondents to the Supply Chain survey segment thought that data quality problems added between 6% and 10% to the cost of operating their supply chain. Almost 16% said data quality problems added between 11% and 20% to supply chain operating costs. That is HUGE! The graphic below gives the survey results. Notice that 44% of the respondents could not monetize their supply chain data problems. That is a serious finding in and of itself.

Thought Experiment: Cut Supply Chain Management Costs by 20%

Over 15% of survey respondents with supply chain issues believed bad data added between 11% and 20% to the cost of operating their supply chain. Let’s use 20% in our thought experiment, to yield a nice round number.

Understanding the total cost of managing a supply chain is a non-trivial exercise. The Supply Chain Council, an industry body, has defined the Supply Chain Operations Reference (SCOR®) model. According to that reference model, Supply Chain Management Costs include the costs to plan, source, and deliver products and services. The costs of making products and handling returns are captured in other aggregations.

For a manufacturing firm with a billion-dollar revenue stream, the total cost of managing a supply chain will be around 20% of revenue, or $200,000,000 USD. Reducing this cost by 20% would mean an annual saving of $40,000,000 USD. That would be a significant savings for a data cleanup and quality sustenance investment of perhaps $3,000,000 USD. The clean-up investment would be a one-time expense. Even if the $40,000,000 were a one-time savings, the ROI would be roughly 1,233% (a $37,000,000 net gain on a $3,000,000 investment).

But wait, it is better than that. The $40,000,000 savings recurs annually. The payback period is measured in months. The ROI is enormous. Having clean data pays big dividends!

If you think the one-time “get it right and keep it right” investment would be more like $10,000,000 USD, your payback period would still be measured in months, not years. Let’s add a 20% additional cost, or $2,000,000 USD, in years 2 through 5 for maintenance and additional quality work. That means you would have spent $18,000,000 USD over 5 years to achieve savings of $200,000,000 USD. That is better than a 10-times return on your money! Not too shabby an investment, and your partners and stakeholders would be saving money too. This scenario is really a win-win situation, right down the line to your customers.
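For readers who want to test the arithmetic with their own numbers, here is a minimal sketch of the thought experiment. Every figure is an assumption carried over from the text, not a measured result.

    # Thought-experiment arithmetic. All figures are the illustrative
    # assumptions from the text (SCM cost ~20% of revenue, 20% of that
    # cost recovered through clean data), not measured results.

    def five_year_multiple(annual_savings: float,
                           initial_investment: float,
                           annual_maintenance: float,
                           years: int = 5) -> tuple[float, float]:
        """Return (total cost, savings-to-cost multiple) over the period."""
        total_cost = initial_investment + annual_maintenance * (years - 1)
        total_savings = annual_savings * years
        return total_cost, total_savings / total_cost

    revenue = 1_000_000_000              # $1B manufacturer
    scm_cost = 0.20 * revenue            # SCM cost at ~20% of revenue
    annual_savings = 0.20 * scm_cost     # 20% of SCM cost: $40M per year

    cost, multiple = five_year_multiple(annual_savings, 10_000_000, 2_000_000)
    print(f"5-year spend: ${cost:,.0f}; savings multiple: {multiple:.1f}x")
    # -> 5-year spend: $18,000,000; savings multiple: 11.1x

Swap in your own revenue, cost ratio, and investment figures; across a wide range of plausible inputs, the payback period stays measured in months, not years.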

The Corcardia Group believes that total supply chain costs for hospitals approach 50% of the hospital’s operating budget. For a hospital with a $60,000,000 USD annual operating budget, that is roughly $30,000,000 USD in supply chain costs; a 20% savings means $6,000,000 USD would be freed for other uses, like curing patients and preventing illness.

Even Better…

For manufacturers, hospitals, and other supply chain participants, ridding themselves of bad data will produce still better returns. Cleaning up data throughout the supply chain is likely to lower costs while improving margins, and product costs for participants could drop. Firms might realize an additional 5% cost savings from this as well, making their return even better.

What does the Supply Chain Community say about Data Quality?

In a 2011 McKinsey & Company collection entitled McKinsey on Supply Chain: Selected Publications, which you can download here, the article “The Challenges Ahead for Supply Chains” by Trish Gyorey, Matt Jochim, and Sabina Norton goes right to the heart of a supply chain’s dependency on data, and the weakness of current supply chain decision-making based on that data. According to the authors:

Knowledge is power. The results show a similar disconnection between data and decision making: companies seem to collect and use much less detailed information than our experience suggests is prudent in making astute supply chain decisions (Exhibit 6). For example, customer service is becoming a higher priority, and executives say their companies balance service and cost to serve effectively… Half of the executives say their companies have limited or no quantitative information about incremental costs for raw materials, manufacturing capacity, and personnel, and 41 percent do not track per-customer supply chain costs at any useful level of detail.

Here is Exhibit 6, the graphic from their study referenced in the quote above.

McKinsey & Company, “The Challenges Ahead for Supply Chains” – Exhibit 6

Andrew White, writing about master data management for the Supply Chain Quarterly in its Q2-2013 issue, underscored the importance of data quality and consistency for supply chain participants.

… there is a growing emphasis among many organizations on knowing their customers’ needs. More than this, organizations are seeking to influence the behavior of customers and prospects, guiding customers’ purchasing decisions toward their own products and services and away from those of competitors. This change in focus is leading to a greater demand for and reliance on consistent data.

White’s take-away from this is…

…as companies’ growing focus on collaboration with trading partners and their need to improve business outcomes, data consistency—especially between trading partners—is increasingly a prerequisite for improved and competitive supply chain performance. As data quality and consistency become increasingly important factors in supply chain performance, companies that want to catch up with the innovators will have to pay closer attention to master data management. That may require supply chain managers to change the way they think about and utilize data.

Did everyone get that? Data quality and consistency are important factors in supply chain performance. You want your auto and your artificial knee joint to work properly and consistently, as their designers and builders intended. This means curing existing victims of the data social disease and preventing its recurrence and spread.

The Bottom Line

At this point, nearly 300 respondents have begun their surveys, and more than 200 have completed them. I urge those who have left their surveys in mid-course to complete them!

Bad data is a social disease that harms supply chain participants and stakeholders. Do take a stand on wiping it out. The simplest first step is to make your experiences known to us by visiting the IBM InfoGovernance site and taking our “Poor Data Quality – Negative Business Outcomes” survey.

When you get to the question about participating in an interview, answer “YES” and give us real examples of problems, solutions attempted, successes attained, and failures sustained. Only by publicizing the magnitude and pervasiveness of this social disease will we collectively stand a chance of achieving cure and prevention.

As a follow-up next step, work with us to survey your organization in a private study that parallels our public InfoGovernance study. The public study forms an excellent baseline against which to compare and contrast the specific data quality issues within your organization. Data quality will not be attained and sustained until your management understands the depth and breadth of the problem and its cost to your organization’s bottom line.

Contact me here and let us help you build the business case for eliminating the causes of bad data.

Published by permission of Stuart Selip, Principal Consulting LLC

Bad Data is a Social Disease (Part 1)

Oct 15, 2013   //   by admin   //   Blog  //  Comments Off

Organizational bad data is a social disease easily passed to your business partners and stakeholders

With 200 completed responses in our “Poor Data Quality – Negative Business Outcomes” survey, run in conjunction with The Robert Frances Group, the IBM InfoGovernance Community, and Chaordix, it is safe to say that bad data is a social disease that can spread easily and quickly. Merriam-Webster defines a social disease as

a disease (as tuberculosis) whose incidence is directly related to social and economic factors

OK, that definition works for the bad data social disease. In this case, the social and economic factors enabling and potentiating this disease include:

  • Business management failing to fund and support data governance initiatives
  • IT management failing to sell the value of data quality to their business colleagues
  • Business partners failing to challenge and push back when bad data is exchanged
  • Financial analysts not downgrading firms that repeatedly refile 10-Ks due to bad data
  • Customers not abandoning firms that err due to bad data quality and management

No doubt you can think of other reasons why the bad data social disease spreads. Was the source of this disease a doorknob? Not according to our survey respondents, who said the chief sources of the bad data social disease are inaccurate, ambiguously defined, and unreliable data, as shown in the following graphic. As you can see, those are not the only causes of the disease. Social diseases negatively affect the sufferer, their partners, and the community around them. According to our respondents:

  • 95% of those suffering supply chain issues noted reduced or lost savings that might have been attained from better supply chain integration.
  • 72% reported customer data problems, and 71% of those respondents lost business because they didn’t know their customers
  • 71% of those suffering financial reporting problems said poor data quality caused them to reach and act upon erroneous conclusions based upon materially faulty revenues, expenses, and/or liabilities
  • 66% missed the chance to accelerate receivables collection
  • 49% reported operations problems from bad data, and 87% of those respondents suffered excess costs for business operations
  • 27% reported strategic planning problems, with 75% of those indicating challenges with financial records (profits and losses of units, taxes paid, capital, true customer profiles, overhead allocations, marginal costs, shareholders, etc.)

This response from our survey respondents highlights the truly dismal state of data quality across a spectrum of organizations.

What is in Part 2?

In next week’s post, we’ll examine some of our survey results specific to bad data and the supply chain. A successful supply chain requires sound internal data integration and equally sound data exchange and integration across chain participants. A network of willing participants exchanging data is fertile ground for spreading the social disease of business. Expect a thought experiment about wringing the costs of bad data quality out of supply chain management, and see what some supply chain experts think about the dependency of effective supply chains on high quality data.

The Bottom Line

Believe that bad data is a social disease and take a stand on wiping it out. The simplest first step is to make your experiences known to us by visiting the IBM InfoGovernance site and taking our “Poor Data Quality – Negative Business Outcomes” survey. When you get to the question about participating in an interview, answer “YES” and give us real examples of problems, solutions attempted, successes attained, and failures sustained. Only by publicizing the magnitude and pervasiveness of this social disease will we collectively stand a chance of achieving cure and prevention.

As a follow-up next step, work with us to survey your organization in a private study that parallels our public InfoGovernance study. The public study forms an excellent baseline against which to compare the specific data quality issues within your organization. You will not attain and sustain data quality until your management understands the depth and breadth of the problem and its cost to your organization’s bottom line.

Bad data is a needless and costly social disease of business. Let’s move forward swiftly and decisively to wipe it out!

Published by permission of Stuart Selip, Principal Consulting LLC

As the consulting industry changes will you be the disrupter, not the disrupted?

Sep 30, 2013   //   by admin   //   Blog  //  Comments Off

Will you be the disrupter, not the disrupted? This is the question that came to mind as I read Consulting on the Cusp of Disruption, by Clayton M. Christensen, Dina Wang, and Derek van Bever, in the October 2013 issue of the Harvard Business Review (HBR). With an online subscription, you can read it here. Disruption means industry leaders are responding to changes in customer demands and global economics by making fundamental changes in their approach to services, service delivery, engagement models, and the economic model on which their industry is based.

As an example of disruption, the HBR authors open by discussing the McKinsey & Company move to develop McKinsey Solutions, an offering that is not “human-capital based” but instead focuses on technology and analytics embedded at the client. This is a significant departure for a firm known for hiring the best and the brightest to deliver key insights and judgment, especially when the judgment business was doing well. The authors make the point that the consulting industry has evolved over time:

  • Generalists have become Functional Specialists
  • Local Structures developed into Global Structures
  • Tightly Structured Teams morphed into spider webs of Remote Specialists

However, McKinsey Solutions was not evolutionary. In its way, it was a revolutionary breakthrough for McKinsey. While McKinsey Solutions’ success meant additional revenue for the firm, and offered another means of remaining “Top of Mind” for the McKinsey Solutions client, the move was really a first line of defense against disruption in the consulting industry. By enjoying “first mover advantage”, McKinsey protected its already strong market position and became the disrupter, not the disrupted.

What is the classic pattern of disruption?

According to Christensen, et al,

New competitors with new business models arrive; incumbents choose to ignore the new players or flee to higher-margin activities; a disrupter whose product was once barely good enough achieves a level of quality acceptable to the broad middle of the market, undermining the position of longtime leaders and often causing the “flip” to a new basis of competition.

Cal Braunstein, CEO of The Robert Frances Group, believes that IT needs a disruptive agenda. In his research note, Cal references the US auto industry back in the happy days when the Model “T” production line completely disrupted competitors who lacked one. But when disruption results in a workable model with entrenched incumbents, the market once again becomes ripe for disruption. That is exactly what happened to the “Big 3” US automakers when Honda and Toyota entered the US market with better quality and service at a dramatically lower price point. Disruption struck again. Detroit never recovered, and the City of Detroit itself is now bankrupt. Disruption has significant consequences.

Industry leaders may suffer most from disruption

In his work “The Innovator’s Solution”, HBR author Clayton M. Christensen addressed the problem of incumbents becoming vulnerable to disruption, writing

An organization’s capabilities become its disabilities when disruption is afoot.

The disruption problem is worse for market leaders, according to Christensen.

No challenge is more difficult for a market leader facing disruption than to turn and fight back – to disrupt itself before a competitor does… Success in self-disruption requires at least the following six elements:

  • An autonomous business unit…
  • Leaders who come from relevant “schools of experience”…
  • A separate resource allocation process…
  • Independent sales channels…
  • A new profit model…
  • Unwavering commitment by the CEO…

So, it will be tough to disrupt yourself if you are big, set in your ways, and don’t have the right CEO.

Being the disrupter, not the disrupted

The HBR authors characterized three forms of consulting offering, ranging from the traditional “Solution Shop” through “Value-added Process Businesses” to “Facilitated Networks”. The spectrum runs from gifted but opaque expert practitioners delivering pronouncements and charging by the hour, through repeatable process delivery that charges for delivered results, to dynamic, configurable collections of experts linked by a business network layer. In my experience, the expert network form is the most flexible, least constrained, and most likely to deliver value at an exceptional price. It is at once the most disruptive and, presently, the form least likely to be destabilized by other disruptive initiatives.

The Bottom Line

If you are in the consulting industry and you don’t recognize that disruptive forces are changing the industry and your market’s expectations as you read this, you will surely be the disrupted, not the disrupter. Disrupters can be expected to provide a consulting service that delivers much more value at a much lower price point. We are talking here of more than a simple 10% process-improvement gain; it will be a quantum jump. Like McKinsey’s, that jump may come from embedding some new solution that accelerates the consulting process and cuts costs.

Now is the time to develop situational awareness. What are the small independent competitors doing? Yes, the little firms that you don’t really think will invade your market and displace you. Watch them carefully, and learn before it is too late. Those readers who staff those small, agile, and disruptive firms should ensure they understand their prospects’ pain points and dissatisfaction with the status quo. As physician Sir William Osler famously said, “Listen to your patient, he is telling you his diagnosis”. Do it now!
reprinted by permission of Stu Selip and Principal Consulting LLC.
