The Travesty of Security Questions
It used to be a good idea. Ask something that was immediately obvious and only knowable by YOU or a very few people – and make it the last line of defense for a password reset or some other high-security function. Mother’s maiden name. City you were married in. Make and model of your car.
Until, like everything, it gets carried to an extreme. Like, for instance, the Apple ID reset process. Let's look at these security questions…
I suppose the standard response would be (if the answer is not readily available) MAKE SOMETHING UP and remember it! But wasn't it the purpose of these security questions NOT to have to remember ONE MORE THING? Heck, if that were the case I wouldn't have forgotten my password in the first place!
…and then there was the concept of REUSE. Use a question that is personal information, but whose use is fairly standard across websites (mother's maiden name, for example). But that, of course, leaves one open to the possibility that a breach at one site is a breach of all of them.
This is what we at Gartner would classify as a wicked hard problem of identity and authorization. And there is no answer yet (two-factor authentication aside). The Achilles heel of virtuality.
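If you do go the MAKE SOMETHING UP route, the least painful version is to stop remembering entirely: generate a random answer per site and stash it in the same password manager that holds the password. A minimal sketch (the sites and questions here are hypothetical, and this is a workaround, not a cure):

```python
import secrets
import string

def random_answer(length: int = 20) -> str:
    """Make up a site-unique 'security answer': nothing personal
    to look up, and nothing reused for a breach to propagate."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Hypothetical entries, stored in a password manager alongside
# the password itself -- one made-up answer per site.
vault = {
    ("example-bank.com", "mother's maiden name"): random_answer(),
    ("example-mail.com", "city you were married in"): random_answer(),
}
print(vault)
```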
These first-job questions – do you mean my news route, my part-time job after high school, or my full-time job after college? The first one that actually paid me? Or the first that paid me a salary?
That first album I purchased? Duh, maybe it was Jimi's "Live at the Fillmore East," which explains why I can't remember a dang thing…
We’ll talk about wicked hard problems like that, and more, at this year’s Catalyst Conference in San Diego.
Yahoo Goes Big and Buys Tumblr
In what has been a relatively slow year for big acquisitions in the tech world, Yahoo's board has approved a billion-dollar-plus outlay to buy Tumblr, a blogging platform with over 100 million blogs as of mid-May 2013. For Yahoo, the Tumblr purchase represents its first big move under Marissa Mayer's regime and an attempt to move its brand back into the consumer-tech conversation.
Why Tumblr and why now are sensible questions and lack clear answers save for the common meme that Yahoo is trying to “buy hipsters.” While that may be fun fodder for the Twitter cybergabfest, Mayer and Yahoo no doubt have some concrete plans beyond PR value and such pat answers as “because they were available” and “we bought Tumblr before one of our competitors did.” After going through a checklist of Yahoo needs and Tumblr strengths, here is one realistic path—content platforms.
The content platform play: Previous Yahoo execs have always talked about Yahoo's "content buckets"—original, licensed and user-generated (i.e. Yahoo Contributor Network)—as one of its key strengths, and since Yahoo is a leading vertical content portal for such areas as sports, finance and gossip/celeb news, Tumblr makes sense as a content management-meets-curation platform. Such a move would allow consumers, brands and marketers (i.e. content marketing) to curate Yahoo's syndicated content (original and licensed) and use those pictures, videos and stories to create personal and professional Tumblr pages. Coke and Campbell's are just two of countless brands that have created Tumblr sites to showcase their wares and tell their stories to consumers. The content platform plan allows Yahoo a few revenue paths, including a freemium service option and a venue for targeted advertising. It also allows Yahoo to get even more mileage out of one of its key "cool" brands—Flickr (another e-less product).
One has to applaud Marissa Mayer's "go big or go home" move and the fact that Yahoo's board is (reportedly) standing behind this purchase with full support. A successful integration of Tumblr into Yahoo's forward-looking strategy could end a losing streak of purchases and fizzled launches that includes Maven Networks and GeoCities as well as Livestand and Yahoo 360. Mayer's biggest challenge—something Yahoo has not shown to be a strength—is to integrate Tumblr into Yahoo's product set and get the cash register ringing again.
Certainly one cannot overlook the power of Yahoo attempting to make itself cool and relevant again, but this must be more than a very expensive PR move. The Tumblr purchase has to be followed up with other moves to bolster its mobile and social strategies. The yodel may be on its way back—time will tell.
Acoustic Mirrors and Contextual Communications
This past weekend my daughter (who I have referenced in several posts) graduated from NC State University–Summa Cum Laude and Phi Beta Kappa–and is heading on to UNC Chapel Hill for Medical School (proud parent addition of information irrelevant to this story). During the weekend, she took several out-of-town visitors on a tour of the NC State campus.
One of the interesting items there is a set of acoustic mirrors:
As you can see, the mirrors are pretty far apart (I'd guess 50 yards). If you sit in one (like my daughter is on the left) and someone else sits in the other, you can hear each other talking, even if you are speaking in a whisper. It was definitely an interesting and unique experience.
I went searching for more information on these mirrors. As it turns out, they were originally used in Britain to help detect the sound of enemy aircraft (before the invention of radar). Now, they are mainly used in science museums to demonstrate how to focus sound.
As marketers, it would be great if all of our audiences were sitting in acoustical mirrors, ready and willing to listen to our messages, no matter how quietly we speak them. But that is not the case. Instead, we need to find ways to get our messages heard through a din of noise and competing messages. Expecting an acoustical mirror effect is unrealistic.
There is a great model for communications that was originally put forth by Don Schultz, Stanley Tannenbaum, and Robert Lauterborn in the book The New Marketing Paradigm: Integrated Marketing Communications, published in 1996. Their model, which still applies today, is shown here (with a recreated graphic):
The model shows that when a sender wants to communicate with a receiver, the ability for that message to be received is affected by how much of a shared field of experience the two parties have and by how much noise distracts the receiver. Furthermore, if feedback can be provided, the sender can verify that the receiver got the right message, or make adjustments until the communication is received properly.
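To make the moving parts concrete, here is a toy simulation of the model (my own illustration, not from the book, and the probabilities are invented): the odds a message lands rise with the shared field of experience, fall with noise, and feedback lets the sender retry until it gets through.

```python
import random

def transmit(shared_experience: float, noise: float) -> bool:
    """One attempt: the message lands in proportion to the overlap
    in fields of experience, degraded by noise (both 0.0-1.0)."""
    return random.random() < shared_experience * (1.0 - noise)

def communicate(shared_experience: float, noise: float,
                max_feedback_rounds: int = 10) -> int:
    """Use feedback to retry until the message is received;
    return the number of rounds needed (-1 = never landed)."""
    for round_no in range(1, max_feedback_rounds + 1):
        if transmit(shared_experience, noise):
            return round_no
        # Feedback loop: the sender learns the message missed
        # and adjusts for another attempt.
    return -1

random.seed(7)
print(communicate(shared_experience=0.8, noise=0.1))  # lands quickly
print(communicate(shared_experience=0.2, noise=0.7))  # may never land
```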
Acoustic Mirrors are a lot like fields of experience–they create an environment for more focused communications. In today’s world, marketers must use context to increase the overlap in fields of experience and reduce the likelihood of noise being introduced into the communications stream.
In effect, context is the modern version of acoustic mirrors and a key focus area for all forms of business communication.
Announcing: Cool Vendors in Consumer Goods, 2013
At Gartner, one of our favorite reports as Analysts is the Cool Vendors report each team does every year. By definition, Cool Vendors are small vendors that offer innovative products or services. The innovation may be in how the product works or in what can be accomplished by those using the product or service. Essentially, it is our chance to pick vendors that are doing unique and cool things in a particular area of our coverage.
This year, my colleague Dale Hagemeyer and I identified three vendors in our published report Cool Vendors in Consumer Goods, 2013. These Cool Vendors share a set of common innovative approaches that seek to bring more value to consumer goods companies. They all demonstrate knowledge of the consumer goods space, have embedded that knowledge into their applications, and seek to offer solutions that can drive more personalized offers and provide more insight into shopping behavior and retail execution:
- Fifth Dimension: Leveraging 3D visualization tools and data to drive new levels of retailer and consumer goods manufacturer collaboration. www.fifth.uk.com
- QThru: Innovatively engaging consumers with context-aware, personalized offers at the point of purchase in retail stores via a self-scanning mobile application. www.qthru.com
- Quri: Using crowdsourcing to help consumer goods companies monitor in-store execution. www.quri.com
Even if you are unable to access our report, definitely check out the websites for these companies and see what makes their services cool.
We are always on the lookout for new vendors that offer exciting technology possibilities that enable sales and marketing for our consumer goods clients. We welcome any ideas or suggestions as we scout for vendors for next year's report and in our ongoing research.
Reminiscing on AADI London 2013
I am heading back to Germany after a few fun days in London at the Gartner AADI conference. We had a good show, with a high level of attendee enthusiasm and interaction. I had good 1-on-1s and a very engaged (if small) set of workshops on .NET in the cloud, private PaaS, and cloud application architecture. I have a few quick takes on my experience.
Professional Effectiveness is a Big Issue
At lunch today I presented to a set of interested attendees on Gartner for Technical Professionals (GTP) – our value proposition, resources available to clients, and so on. In a nutshell: as GTP analysts our mission is to accelerate time to competency for the solution architects using our material. When you get a new project, we provide objective advice and how-to guides that will shorten your effort’s time-to-value.
Among our technical coverage areas (which include application platforms, data centers, cloud computing, mobility, social software, and so on) we also include one focused on "soft skills," which we call Professional Effectiveness. The idea here is to help the propellerheads communicate more effectively, manage up better, manage their careers more competently and just generally function at a higher level than their purely technical skills might suggest. The attendees at our GTP lunch focused laser-like on this topic, sharing with us numerous challenges around staff development. This is a great way to leverage GTP research and I was glad to see it. As a broader trend, I think it bodes well for the next generation of techies to have engaged managers who care deeply about professional development. And, it's neat to see how our PE agenda is helping to shape at least a small number of the next generation of IT Leaders.
.NET Shops are Interested in a Hybrid Cloud Future
My workshop on "Choosing .NET Platforms for Cloud Computing" was my best session of the conference. Workshops are a great forum, providing an extended time (90 minutes) for analyst exposition on a topic, incorporating several exercises and group discussion. The intimate environment of a large boardroom makes for a crackling atmosphere when the audience is truly engaged, as the attendees of my session were. We worked through the framework I developed for my soon-to-be-published paper on assessing .NET alternatives.
Capacity utilization rose to the top as the most important consideration among session attendees, followed closely by reduced time-to-market. Further, all reported very strong interest in hybrid cloud deployment models that preserve the enterprise's option to choose runtime location at a later date. Basically, the .NET shops represented in my session are looking to get elastic, cloud-native solutions out the door as fast as possible without compromising portability – mitigating vendor lock-in as much as possible (or as much as is possible once you've already locked yourself to Microsoft by virtue of choosing .NET). These were interesting findings for me, and I didn't see as much desire for lift-and-shift style cloud adoption as I expected.
API Management is Really Popular
As usual, 1-on-1 topics were diverse and interesting, but the most often asked question had to do with Web APIs. API management came up frequently but it was good to hear more clients asking about API design issues, which I think are so much more important than which API product suite you end up buying (or consuming as a cloud service). API management has been a hot space lately and there was no shortage of vendor representation at the event, but attendees were also very interested. I saw a great appetite for my API design paper – which I hope will prevent some GTP clients from making major design mistakes.
Did you attend Gartner AADI? What were your most important take-aways?
Order From Chaos: Creating A Standard For Mobile App Management And Security
If there's one thing the mobile industry is known for, it's standards. There are a lot of them. In networking technology you have multiple Wi-Fi standards in use: 802.11 a, b, g, n, ac. In wide-area wireless there are GSM, CDMA, WCDMA and LTE. And for mobile OSes you have iOS, Android, QNX, Windows Phone, et al. The problem for mobile OSes is that there are too many standards–and none with the weight in the market to become de facto (as driven by adopters), like what happened in the PC world when it was Microsoft vs. IBM (who won that one?). When enterprises could dictate their own individual standards, this wasn't an issue. But in today's world of BYOD, it is only getting worse, especially when it comes to mobile software and app management. Each mobile platform has its own app SDK, and with consumerization, very little thought has gone into securing and managing these consumer apps for enterprise users. But as enterprise users adopt these apps for work, this needs to change.
I covered some of the strategies for implementing app management and security in my January research note on containerization. One method, where there is a proprietary SDK from one of the multitude of MDM vendors–what we call app specific–has been around for a couple of years now. But at best only 40-50 apps have been developed this way. The problem is that the management SDK is proprietary to each vendor, so a management tool can only support its specific (hence app specific) apps. Plus, pre-existing apps need to be rewritten. Most app developers have held off committing because of this. Another method is to wrap the app, but getting access to the binary, especially for third-party apps found on public app stores, is difficult–and the result is still proprietary to the application wrapper for management. What's needed is some type of standard that app developers could use and that all MDM and app management vendors could integrate with. Of course, that would mean getting all those vendors to agree on one method–probably some type of open-source mobile app management SDK. Then these vendors could compete on managing and securing apps, not on wooing app developers to use their standard. Another option would be to use app wrapping but separate the admin functions and APIs from the wrapping technology itself. This has the advantage of quickly adapting existing apps without a lot of recoding.
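To make the idea concrete, here is a sketch of what one shared app-management contract might look like. It is purely hypothetical (a real SDK would be native iOS/Android code, and the method names are invented); the point is the shape of the standard, not anyone's actual API:

```python
from abc import ABC, abstractmethod

class AppManagementSDK(ABC):
    """Hypothetical open app-management contract. If every MDM
    vendor implemented this one interface, an app could be written
    (or wrapped) once and managed from any vendor's console --
    vendors would compete on management, not on SDK lock-in."""

    @abstractmethod
    def authenticate_user(self, corporate_identity: str) -> bool:
        """Gate app access on an enterprise identity check."""

    @abstractmethod
    def apply_policy(self, policy: dict) -> None:
        """Push restrictions such as copy/paste or open-in rules."""

    @abstractmethod
    def encrypt_app_data(self) -> None:
        """Keep the app's local data in an enterprise-controlled container."""

    @abstractmethod
    def selective_wipe(self) -> None:
        """Remove corporate data and access without touching personal data."""
```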
One well-known MDM vendor, MobileIron, is beginning to create an open SDK standard it's calling (for now) the Open App Alliance, which was mentioned last month on BrianMadden.com. It's hoping to go public with the details in the next few weeks, but the alliance should include some big app providers, app development tool vendors and maybe even some adopter companies at the start. MobileIron would rather compete on its MDM platform than spend the time convincing adopters and developers to use its proprietary app SDK. One thing missing, at least for now, is other MDM vendors. In the end, their buy-in is essential for this to succeed. If enough adopters and app providers hop on board, that may convince other MDM vendors to head in this direction. Many of the big MDM vendors I talked to about this are interested but have not committed yet.
It remains to be seen whether this will have the momentum to move forward. There's a lot of work left to do and not a lot of time to do it, but in my mind, something needs to be done to alleviate the fragmentation of mobile technology and get apps manageable and secured–and this is at least a step in the right direction.
Two Inconvenient Truths about IT Compliance
I am very pleased to announce that my first document, Achieving IT GRC Success, was published this week and is now available to Gartner for Technical Professionals subscribers. The research and writing process led to many interesting conversations about governance, risk management and compliance with clients and colleagues. Let's examine two "inconvenient truths" about IT compliance…
IT Compliance Doesn’t Exist!
IT clearly has a significant amount of compliance work that it performs – no doubt about it, especially in highly regulated industries. IT, of course, exists solely to support business objectives. The compliance requirements that IT fulfills are derived from those business objectives. Here are a few quick examples:
- PCI compliance results from a business desire to accept payment via credit cards.
- IT and Information Security have a role in achieving overall SOX compliance, which results from an organization's status as a US-based public corporation (and its desire to remain so).
- IT requirements for HIPAA result directly from an enterprise need to process confidential healthcare data.
Yet, it seems that in many enterprises IT compliance is seen as separate from and in conflict with “meeting business requirements”. This is in spite of the fact that none of the examples above can be achieved without significant efforts from business partners themselves.
So, why is this interesting? In a word: “Resources” – time and money. In most organizations, funding and prioritization of compliance activities is very hard to come by. Much of this may be a perspective problem, as IT organizations often do not include compliance efforts in project prioritization processes side-by-side with other initiatives – that may be a lost opportunity.
Compliance is Risk Management – Just NOT YOUR Risk Management!
Compliance is not a substitute for risk management. Compliance requirements are the result of an external group’s anxiety regarding risk. Usually the external group is either a government body or large commercial concern. Let’s take a second look at the three examples from above:
- SOX is the result of anxiety over the risks of financial reporting errors and a response to a number of major corporate accounting scandals.
- PCI is the result of the payment card industry’s desire to ensure minimum standards for the safeguarding of payment card information and transactions.
- The HIPAA Privacy Rule is a direct result of the general public's concerns regarding the use (or misuse) of individual healthcare records.
In all of these cases, the compliance requirements are designed to address issues to reduce risk to a level that is tolerable by the government or commercial group. The fact that compliance requirements are designed to provide broad and general coverage often results in cases where enterprises could more effectively manage the risk using controls specific to their environment and situation. The fact that this is often not acceptable to examiners is a significant source of frustration.
The last decade has seen compliance mandates become more risk-oriented and begin to include risk assessments and control design as a part of the compliance process. Regrettably, all too often new compliance requirements continue to come in the form of a checklist.
There are few options available to us to improve the situation. Participation in regulatory rule making is a time-consuming process that rarely contributes to success in our "day jobs". Improving compliance and regulatory rule making is a long game, and the best thing for many enterprises to do is to share their risk management and control design approaches with examiners. Educating the examiners, who often have significant influence over the rule-making process, may be the most productive approach.
What Every Digital Brand Manager Should Embrace – Right Now
Amplifying your story over digital media is what branding in the 21st century is all about – and it’s been a hugely disruptive trend.
When Larry Light was CMO of McDonald's, he found parallels between marketers, editors and writers. Just as writers collaborate with their editors to develop the many facets of a story, marketers develop stories with their customers. Happy Meals was one outcome of Light's storytelling exercise, convincing him that brand managers shouldn't tell product stories but rather stories about the outcomes their products produce for customers. Mr. Light soon coined the term "brand journalism" to describe this new twist on brand communications.
In the digital age, opportunities to tell brand stories multiply. Buyers, for example, freely share stories of their favorite brands every day – by the thousands – so much so that some pundits say marketers have ceded control of their story to others, whether they like it or not. This is hugely overstated. While we can't control what customers say, we can certainly coach them.
Marketo, for example, knows its story well – helping marketers generate high-quality prospects, the type that convert to customers, faster. Some users say Marketo's solution has accelerated their sales cycles two to three times. When this story plays out, revenue accelerates and sales costs decline. And so Marketo simply asks customers to tell their stories within this context. Customers happily comply (see Marketo's video testimonials).
Brand managers at P&G, Starbucks, BMW and Mercedes have all gotten hip to this storytelling idea, coaching customers to share their own experience, within the context of the company's brand promise. Look on any auto enthusiast site; BMW owners can't wait to share exactly what BMW wants them to share: driving excitement. Some do it willingly; some are nudged a bit by BMW marketers.
There are, of course, those customers who decline coaching – and we welcome those too, for it's those customers who produce the kind of surprises we like, especially product usage we hadn't even thought of (as General Mills found when many customers said Cheerios was their favorite bedtime snack).
Like any disruption, find a way to embrace it. Let your customers tell your story over Facebook, Twitter, LinkedIn, the blogosphere, and community forums. Coach them (or not) but let those stories flow – for stories engage, stories inspire. Stories help us remember, which after all, is what brand awareness is all about.
So long and thanks for all the fish
This all started back in January 2009, and 718 posts later, with more than 3,700 comments, it is time to conclude this blog. During the past four years, the blog has sought to present and share ideas about technology in the enterprise, the role of the CIO and the changing nature of IT leadership. During that time we have explored a range of issues from digital technology to management practices and even the introduction of a magic quadrant covering magic.
It has been great fun and I have learned much from all of the smart people who have taken the time to read, comment and contribute to the blog. My deepest apologies as the blog often featured grammar that would make an English teacher cringe. I will admit to not really proofreading every post all the time as the ideas and the desire to publish them got ahead of prudent respect for the reader. Please accept my apologies.
I have provided a few links below to the posts that garnered the greatest interest, comments or are worth remembering.
Leading in Times of Transition: the 2010 CIO Agenda - the most links via Bitly
Gartner announces a new magic quadrant – the most comments
The Nature of Change is Changing: A new Pattern — the most comments for a serious post
An IT value sampler for the holidays
Chief Digital Officer, What type does your organization need?
Digitalization creates new dimensions for disruption
Everyone will recall the source of the title of this post: Douglas Adams' book of the same name, part of the Hitchhiker's series that should be on everyone's must-read list. I took his title as the title for this post not because things are ending, but because they are always beginning. That beginning in 2009 started with a simple question about IT budgets, and now I turn the conversation over to my peers at Gartner and to the community in general.
Thank you for your time, attention, energy, knowledge and experience all shared on this blog. It is not mine; it is ours. It is also my privilege to share.
If You Give an Intern Control of Your Facebook Page...
Let’s press play and turn on the speakers for this one because it’s nearly summer time and while our interns come to join us, we should listen to some relaxing music. Plus, your intern might look like Chad or Jeremy in ’64.
If you give an intern control of your Facebook page, one or two or three of many things could happen. He/she might…
Establish the customer-facing social media strategy you lack…
Get angry with you and post something inappropriate…
Build a foundation on which you can express your brand’s brand on social media…
Build a foundation for your brand that is not really your brand’s brand…
Flourish under your leadership and become a change agent and leader in your organization…
Go work for a competitor, while still having admin access to your Facebook page…
As intern season springs upon us and the doe-eyed young’ns enter our white-walled dwellings, let’s keep in mind that the tasks we assign them are tasks that have an impact on both the intern and the organization. When I was a wee intern, some of the folks at my former employer gave some of us a camera, they gave others the equipment to podcast, and others the opportunity to blog. That one day where someone believed in us and entrusted us to go out and express how we felt about the company we were working for changed our understanding of “career” and what we could do to impact a “stiff” organization.
But also keep in mind that these interns don’t know the history behind your brand. They don’t necessarily understand your industry and your competitors. They might not even understand your product. And worse, they will be gone at the end of the summer. Don’t have them start something you can’t carry on.
And with those few words of wisdom, everyone enjoy your summer.
Let the music play out…
xMatters Acquisition Gives the Bamboo Incident Management Mobile App a Home
On 12 February 2013, emergency/mass notification services (EMNS) vendor xMatters purchased the intellectual property of Bamboo, an enterprise-level incident management mobile app from Deloitte Australia, for an undisclosed amount. Members of Deloitte’s risk practice are assisting in the full transition as are application developers from the Bamboo team employed in the build-out. This acquisition of Bamboo, a mobile app for incident management, should appeal to companies looking to integrate emergency/mass notification services and offline access to recovery plans in a mobile platform.
Bamboo has now found a software development home to enhance its business continuity management software. Gartner believes xMatters has the most opportunity to grow Bamboo adoption by supporting the importation of Microsoft Word and Excel files as well as a SharePoint Web service API for those who do not use business continuity management planning (BCMP) tools now. Gartner also believes xMatters should consider evaluating its EMNS pricing strategy to make it more competitive with the rest of the market for increased adoption of Bamboo by prospects that do not already have an EMNS tool.
xMatters adds a mobile app that supports push technology for recovery plan updates, role-based and offline recovery plan access, and GIS-enabled tracking of all capabilities used for real-time incident management. Integration with the xMatters IT alerting system may be a future enhancement.
Before this acquisition, Gartner observed limited Bamboo adoption by our clients, who cited additional costs compared to perceived benefits; Australia-only product support with uncertain future support from Deloitte (which is not known for mobile application development); and limited business continuity management tool integration.
In both current and combined forms, Bamboo powered by xMatters lacks many of the capabilities of the larger BCMP market, particularly related to planning functions, including:
- Business impact analysis
- Risk assessment
- Recovery plan development, maintenance and exercising
But the offering could appeal to xMatters customers that lack a mobile app for real-time incident management.
BCMP tool customers: If you are looking for EMNS and enhanced real-time incident management capabilities through a mobile device, encourage your BCMP vendor to integrate with xMatters.
EMNS tool prospects: Consider xMatters because it now has an enhanced mobile app for offline recovery plan access, emergency contact list dialing and GIS for resource tracking — all used for real-time incident management support.
BCMP vendors that only have mobile Web browser access: If you are looking for an EMNS tie-in, either integrate with xMatters or enhance your mobile app to provide push technology for recovery plan updates, role-based and offline access to plans through the mobile device, and EMNS integration.
xMatters EMNS competitors: Enhance your mobile app to support push technology for recovery plan updates, role-based and offline recovery plan access and GIS-enabled resource tracking. (EMNS leaders currently support GIS-enabled resource tracking.)
Existing Bamboo customers: Discuss with your EMNS vendor whether it will continue supporting Bamboo, as it may be a direct competitor to xMatters.
“Best Practices: EMNS Implementation Advice” — EMNS implemented without a well-considered plan can hurt the constituencies that rely on these services for everything from basic safety to basic survival. By Roberta Witty and John Girard
“Market Analysis in Depth: EMNS Magic Quadrant” — Buyers of EMNS should use this research to guide their vendor selection projects. By Roberta Witty, John Girard and Catherine Goldstein
Ask Jeff Brooks if 'Dem Polls Important (Potent Polls)
Shout out to anyone who can figure out which lyricist the title of this blog is tied to. (Hint: His pep talks turn into pep rallies.)
Short but sweet today – my colleague Jeff Brooks and I have been on our grind through the initial round of IT Service Support Tool Vendor Magic Quadrant demos, with a handful more to complete this week. We've also been doing our fair share of inquiries, strategic advisory sessions and client engagement days through this first quarter, and while we get together often to discuss client chatter, industry trends and the most recently watched episode of "Girls", we recognize that there is clearly a need to go out and get more information. To help extend our reach, we have recently launched an IT service desk survey.
We’re after the IT leader buying trends, concerns, and framework considerations to supplement our research, and we need your help.
Our love doesn’t cost a thing, but we are fully aware that your time is valuable, so for that we would like to offer anyone who completes the survey a Gartner research note of their choosing to be emailed directly to them.
The survey can be found here: http://tinyurl.com/bhtv2ej
But wait, there’s more! (#BillyMaysVoice)
Jeff Brooks will be attending the SITS2013 show (http://www.servicedeskshow.com/) in London, April 23-24. Jeff will establish an official Gartner presence in the Expo Hall, and anyone interested in a meet-up can find him there. In addition to insight, Jeff will be handing out Gartner schwag, while supplies last, and trust me – you don't want to miss out. Jeff is also scheduled to present and host separate sessions, one of which will be a fiery debate on whether or not ITIL has an expiration date. Surely Jeff didn't think having that debate in the UK was a great idea, but he informed me that he's watched enough Jerry Springer to know to duck flying chairs at his head if necessary. Gartner analyst and UK resident Ian Head will also be in the building, so if things get really dicey, Jeff is covered.
So please – feel free to take the survey, either online or in person at SITS in a few short weeks!
Either way, thank you in advance for your participation!
It’s Nice When Old Clothes Still Fit
It’s been three months since I re-joined Gartner and I thought it would be a good time to re-introduce myself via the Gartner Blog Network. For those not familiar with my background, a brief history of my analyst experience:
- Meta Group 1996-2005
- Burton Group 2005-2010
- Gartner Jan 2010-July 2010
Over those years, I've covered a range of topics, but they all intersected with collaboration and social software – with a particularly deep research focus on topics related to social networking. I enjoyed my experience on the vendor side of the world while at Cisco. It was interesting to see how technologies are brought to market and some of the opportunities and challenges vendors face when moving into new markets. But research is my passion and I'm happy to be back doing something that is so deeply ingrained in the way I look at the world. There are some differences in the way I approach this role now compared to before that I'd like to share:
- Theory: I’m in the middle of a Master’s program in Media Studies at The New School. The experience has reinforced my intent to anchor my research to scholarly sources whenever possible and relevant.
- Culture: While media and technology play an important role in the life of an analyst, knowing more about the cultural context of how people go about their routines, through the eyes of the participants, can reveal tremendous insight – it's changed the way I think about and approach research.
- Practice: Which leads me to better understanding the things people do, their patterns in everyday life (how people participate and contribute), the social processes that influence how people take action (or not), and how all of these dynamics are applied in a work environment to get something done.
In a way, I'm more exploratory and observant in areas that might appear to be far removed from technology, yet those experiences influence how people go through everyday life as a consumer, customer, employee, teammate, community member, management leader, etc. Everyone has multiple identities. All human interaction occurs in a network context. While I find these topics important from a research perspective, don't worry – I express my views in the language of our business and IT clients. I do pass along articles related to these topics (e.g., anthropology, ethnography, design, social networking, identity, social capital, etc.) that I find interesting via Twitter (@MikeGotta), if you're interested.
Shifting away from my academic side, I’ll be focusing on a variety of challenges and opportunities faced by IT leaders involved in collaboration and social software strategies. I look forward to hearing from you and sharing my views on:
- How to put together an internal collaboration or social strategy
- How to approach the cultural aspects of teaming, community-building, and social networking
- What impact can new approaches towards research and design have on adoption of collaboration and social applications? To get an idea of where I'm heading with this, see this recently published report: Leverage Design Ethnography to Boost Enterprise Social Networking Success. (Note: you need to have the appropriate client access rights.)
- How to approach the business case. What can be done to express the value of collaboration and social solutions in terms that show business value (e.g., key performance indicators, ROI, etc.)?
- Can integration of collaboration and social technologies into applications and processes improve their use and create better business outcomes? How can we design environments that enable people to better mobilize their professional networks to improve the effectiveness of informal work processes?
General questions such as:
- The impact of social on the employee life-cycle
- Why social networking is important from a management and employee perspective
- Applying design ethnography to improve the use of social applications
- Re-thinking cultural change through the eyes of its participants
- Mobility aspects of social and collaboration strategies. This is an area I’ll be ramping up on. I’m especially interested in how we can improve the research and design aspects of social and collaborative apps.
Specific technology questions related to:
- Social networking applications and platforms, including profiles, social graphs, activity streams, social objects, and social analytics
- Expertise location and Q&A applications within the enterprise
- Various vendor collaboration and social platforms
- Mobile social and collaboration apps (Note: ramping up)
In terms of some long-range questions I have for myself – here’s what I’m thinking about – somewhat academic but my findings would be expressed in business and IT terms:
- How do social structures (e.g., teams, communities, networks) emerge, and how do management, culture, and media influence those relationships?
- How do people cultivate and mobilize their social networks?
- How do we encourage a more participatory employee culture? What impact do media literacies have on people’s ability to contribute effectively?
- How does mobile (as a more intimate form of computing) affect how people communicate, share, and build relationships?
As I investigate these open-ended research inquiries, I recast my findings into an enterprise context, taking advantage of quantitative and qualitative studies and all of the various resources and interactions I have here at Gartner.
That’s it. I hope this helps outline areas where I can help. Sorry if it’s a bit long.
BTW, if you are an organization applying various qualitative research approaches (like ethnography), I’d very much like to hear about your experience – whether it’s externally or internally focused.
Of Budgets, Short-sightedness and Special Pleading
On May 14th the Australian Commonwealth government brought down its national budget for the year ending June 2014. So the IT pundits in Australia are busy pontificating about the impact of the national budget on the IT industry. Fair enough. That’s their job. It happens in all countries, states and provinces.
But as we listen to the pundits’ comments, and those of the IT industry groups and lobbyists, let’s bear in mind a few facts that are applicable worldwide. (My turn to pontificate!)
- The IT industry is not special. It’s an important industry, but no more so than any other. It employs a good number of people, as do other industries. It adds value, as do all viable industries. When well deployed and used, its products and services can help make its users more productive and effective; so can good HR consultants, well targeted financial services, the right plant & equipment, cost-effective transportation, the education sector and good public policy makers … and one could go on and on.
- Any special treatment for the IT industry – tax concessions or handouts – is going to be paid for by other taxpayers in the end, and many of those are themselves in the IT industry or would spend money on IT.
- Seeking government “incentives” for investing in IT or in IT companies should be unnecessary if it’s such a great industry. (And if it’s not, why invest?)
- If an industry needs to be “promoted”, something must be wrong. Why can’t its sales and marketing people do that?
- In those countries where the government funds much of the education, a focus on specific IT vocational or technical skills is short-sighted. Those are already out of date by the end of the course. IT providers, like all business people, should be looking for the economy in which they operate to be providing a pool of educated people. Those are people with an education that prepares them for the rapidly changing and challenging business environment, people who can think and learn, and continue to do so when the world changes, as it will.
- It isn’t just specific IT spending initiatives in the budget that we should look for to see what the government itself will spend on IT. Every government activity, every existing and new program requires IT to make it work. Nothing much happens without IT. Therefore, commenting on the supposedly good news or disappointing news about IT initiatives misses the point.
These blog posts will continue to discuss the business of IT Services.
When Cloud Storage Protects Your Data From Yourself
There are many benefits to using a cloud service for personal file storage, such as the ability to access your data from most any internet-connected device or, in many cases, eliminating the need for local backup. Keeping data in the cloud also protects it from a variety of local misfortunes, which may sometimes include your own actions.
I did not jump onto the personal cloud storage bandwagon until recently. I generally accessed data from home, and I preferred to manage backups myself. However, I recently did an about-face and now use cloud storage as part of my personal backup strategy. Why? It’s not because I didn’t want to manage my data anymore (I still back up my files locally too). It’s not because I need to access it from anywhere (I still access it primarily from home). It’s because I’m human and I make mistakes.
This decision came about when I painted myself into a corner migrating from an aging laptop to a new desktop. (Yes, a desktop. At the price I'm willing to pay for the options that I want, a desktop still beats a laptop.) The first time I booted up, my desktop had two mirrored 1TB hard drives. After copying critical files from my laptop and setting up the new system just the way I like it (solid black wallpaper, no desktop icons), I promptly backed everything up to an external USB hard drive. I then had five copies of my most important data: one on my laptop's internal drive, two on my desktop's internal drives (mirrored copies), one on my external drive, and one older copy on a USB stick (the last one may be a bit paranoid). Surely, I thought, this would protect me from most any form of conceivable disaster? Unfortunately, I failed to consider the single point of failure: me.
About two weeks after the desktop arrived, I received an exciting package in the mail: my SSD, which I immediately slotted into my desktop. However, during the process of getting the OS properly installed on the SSD, certain actions were taken and certain mistakes made. At one point during the evening, I had an improperly formatted SSD (very much my fault), a split of the internal mirror rendering the disks unreadable (mostly my fault), and a reformatted external drive (also very much my fault). In one fell swoop, three of my five copies of data were now inaccessible, leaving my aging, prone-to-overheating laptop as the only up-to-date copy of my personal data. The root cause analysis? User error.
Eventually, I managed to untangle the mess I had made and ended up with an SSD containing my OS, hard drives with my data (no longer mirrored), and an external hard drive with a backup of both. However, as the complexity of my environment grew, even a slight misstep could cast my data into oblivion. Shortly after saving my data from myself, I found a cloud service to supplement my local backups, and all of my local data is kept in sync with that service. Critically, the copy that resides on the cloud is all but immune to the many ways I could accidentally ruin my own local copy.
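One habit that would have saved me that evening is sketched below: verify that a backup actually matches the primary before attempting any risky disk surgery. The paths are hypothetical; the point is the checksum comparison, not the specific tool.

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large files are fine."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def diverged(primary: Path, backup: Path) -> list:
    """List files whose backup copy is missing or no longer matches
    the primary -- run this BEFORE reformatting anything."""
    problems = []
    for src in primary.rglob("*"):
        if src.is_file():
            dst = backup / src.relative_to(primary)
            if not dst.is_file() or checksum(src) != checksum(dst):
                problems.append(str(src))
    return problems

# Hypothetical locations -- point these at your own copies.
print(diverged(Path("/data/primary"), Path("/mnt/external/backup")))
```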
I would consider myself a reasonably computer-literate person, but that doesn’t prevent me from making mistakes, which in this case had nearly catastrophic consequences. A cloud backup provides insurance not only against technical failures, but also against our own mistakes.
What should technologies like IBM's Watson and Google's Knowledge Graph mean to you?
I’ve seen the future … and now it’s within grasp. It’s going to impact your life and your work before this decade is out.
We just published a note entitled Exploit the Intersect of IBM’s Social Business and Solution Selling Strategies.
A part of that note, probably one quarter, dives into what has been fascinating me for many, many years. There are only so many features you can stick into an email program or a content editing tool, particularly if you're text-centric. Where do we go after the 177th version of a personal productivity tool suite? The 34th iteration of instant messaging? The 500th document database? So much of what we're doing now is reinventing and refining what we were already doing in the pre-client-server era. Back in the late '80s at Digital Equipment, we had a vision and architecture for compound documents in GUI environments… how many more iterations of that do we really need?
There’s more coming, very different. It’s not a new kerning tool. Or the next great slide transition mechanism. It’s about the rise of smart assistants. Natural language processing. Semantic analysis. Massive parallelization. Rule-based systems with machine learning. Pattern recognition and matching. Marry that to the scale of what Google can do and what IBM, with Watson and co-development partners can do.
Start with Google: witness, for example, the Knowledge Graph and Google Now.
Then look at IBM. Witness, again, for example:
- Watson — as in Ken Jennings' declaration on Jeopardy: "I, for one, welcome our new computer overlords"
- Consider the Watson "Oncology Treatment Advisor." IBM co-developed it with WellPoint, which is now selling it. It's narrowly focused today on lung and breast cancer cases. It digests hundreds of millions of pages of published research and other reference data, considers the patient's data (such as diagnostic test data, prior treatments and broader history) and suggests to the clinician a list of alternative treatments to consider. The list is ordered — based on a calculated likelihood of success — and provides access to all the relevant information the system has considered in constructing each recommendation.
- IBM is also working with Memorial Sloan-Kettering Cancer Center and others on additional, very narrow but high-value use cases in various medical fields. Other co-development projects are under way in other industries.
- This isn't just about game shows and sleights of programmer hands!
Google Now represents analysis of a longitudinal array of information about what you do, where you go, what you say, whom you pay attention to and whom you interact with across time, so that Google (and, for that matter, Siri, its cross-valley competitor) can predict what you will need in your current context — before you even know it. There's a staggering amount of personal information it can mine.
This isn’t just the Apple Knowledge Navigator reborn.
And then there’s Watson and the techniques IBM is using to evolve future generations of its capabilities…
I see radical change coming — glorious and depressing, liberating and enslaving, enriching all and only a few. This isn't necessarily the optimistic world of Brynjolfsson and McAfee's Race Against the Machine.
How is this going to affect your organization (IT)? Your enterprise? Your industry? The economy? Society? What do you counsel your children to pursue as a career? As their passion?
Enter Web-scale IT
In a research note that was published yesterday, Gartner introduced the term "web-scale IT." What is web-scale IT? It's our effort to describe all of the things happening at large cloud services firms such as Google, Amazon, Rackspace, Netflix, Facebook, etc., that enable them to achieve extreme levels of service delivery as compared to many of their enterprise counterparts.
In the note, we identify six elements of the web-scale recipe: industrial data centers, web-oriented architectures, programmable management, agile processes, a collaborative organization style and a learning culture. The last few items are normally discussed in the context of DevOps, but we saw a need to expand the perspective to include the changes being made with respect to infrastructure and applications that act to complement DevOps capabilities. So, we're not trying to minimize DevOps in any way, because we view it as essential to "running with the big dogs," but we're also saying that there's more that needs to be done with respect to the underlying technology to optimize end-to-end agility.
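Of the six elements, "programmable management" is the easiest to picture in code: desired state is declared, and software (not a human clicking through a console) converges reality toward it. A minimal sketch with invented service names, illustrative only and not taken from the research note:

```python
DESIRED_STATE = {"web": 4, "worker": 2}  # instances per service

def current_state() -> dict:
    """Stub: in practice this would query the infrastructure API."""
    return {"web": 3, "worker": 2}

def reconcile() -> None:
    """Programmable management in miniature: compare declared
    state with reality and issue corrective actions as code."""
    actual = current_state()
    for service, wanted in DESIRED_STATE.items():
        delta = wanted - actual.get(service, 0)
        if delta > 0:
            print(f"scaling up {service} by {delta}")
        elif delta < 0:
            print(f"scaling down {service} by {-delta}")

reconcile()  # -> scaling up web by 1
```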
In addition, while the term "scale" usually refers to size, we're not suggesting that only large enterprises can benefit. Another scale "attribute" is speed, so we're stating that even smaller firms (or departments within larger IT organizations) can still benefit from a web-scale IT approach. Agility has no size correlation, so even more modestly sized organizations can achieve some of the capabilities of an Amazon, etc., provided that they are willing to question conventional wisdom where needed.
Web-scale IT is not one-size-fits-all, as we don't want to replace one IT dogma with another. In true pace-layered fashion, use the approach to IT service delivery that works best for your customers. Gartner suggests that so-called "systems of innovation" (applications and services needing high rates of change) are the more likely initial candidates, but IT organizations are urged to experiment to see what makes sense for them.
Stay tuned for more on web-scale IT from Gartner in the future!
SAP Sapphire 2013– A Few Thoughts…
Overall, Sapphire 2013 was for me a "coming out" party for SAP Hana. She is a debutante, now seeking dance partners. The event did not introduce any new step change in technology or in SAP's future. Rather, it seemed to be a call to action for the market to get into SAP Hana.
As an old Supply Chain Management business user, I can appreciate the value and promise of what SAP Hana could bring to the market. There were several examples of innovations coming to market. In a past life I tried to bring to market an innovative solution to a problem that required a calculation that had not previously been adopted in the market. It failed; it turns out some organizations don't actually want to know the true cost of some activities. And I remember the days of Fast MRP. Many factories cannot change their schedules minute by minute due to physical constraints. So while I was generally excited, the real world has some very real constraints – as well as political challenges – that will slow adoption of SAP Hana. For industries that have information as their product (insurance, banking, financial services) and those other industries that have elements of information-rich processes (parts of healthcare), SAP Hana has some great promise.
Here are some other observations:
- There was continued focus on expanding the ecosystem of partners and opportunities for SAP Hana
- Cloud and SAP Hana – this was, for me, a bit of a non-event. It was hyped quite a bit, yet it seemed the main benefits are related to lower TCO for IT. There was some talk of improved access to innovation for the business, but this seemed to me to be part of the SAP Hana message, not the cloud message. I may have missed something, but for me, SAP Hana is the main message here – not cloud
- Cloud, SAP Hana and business transformation – what SAP did NOT do was explore or talk about what SAP Hana and cloud, coupled with Ariba's business network, could do. And I mean with innovation. Just connecting a network is not different; designing and developing new processes (multienterprise apps) that replace processes (apps) behind the firewall – now that is disruptive. But this was not the message. I guess that innovation will remain with smaller firms that SAP has yet to acquire – if it ever will. Why would you eat your own children?
- Zero latency between OLTP and OLAP – clearly a pending, near future for us. Very exciting. It could eradicate the need for any kind of data warehouse. It could help unify analytics with business processes – thus killing off BI as we know it and making "process is king" dominant over the BI world. Of course, I am getting ahead of myself. Big Data will drive demand for more data warehousing – even if the DW is not in SAP Hana… so perhaps it's the on-premises, on-disk data warehouse that will lose its primacy…
- SAP Information Steward, SAP MDG, and information governance – there was some information on a new release of SAP Information Steward (in ramp-up) that attempts to put a financial face on the business impact of poor/bad data. It could be very interesting, and innovative. We wait to see more.
- User experience is a top priority. Well, this has happened before; I forget the catchy name for the last effort. However, SAP Fiori does look promising. The demo was not that useful, but the idea of a unified platform for UI development on HTML5, even using Chrome, sounds promising. We just don't want too many enthusiastic (or citizen) developers going crazy. We need smart artists to get involved.
Finally, my colleague Nigel Rayner asked Hasso and Vishal, as part of the Executive Q&A, "When and how will SAP support real-time analytics across its various business applications (some built, some acquired) and analytic data warehouses?" The answer given was "today." There was a little give and take, along the lines of "So does that mean there is a logical data model?", which attracted a "yes."
This topic is a kind of holy grail conversation. In fact, only the other week I was party to one of those massive email chains at work where analysts chime in to discuss how process, analytics and data are fighting for ownership and hegemony over each other. We should have explored the issue with Hasso and Vishal; the answer was not really targeted at the question actually being asked. Of course, any vendor can build an integration for a range of given applications. We have been doing that for years. But:
a) How and when will SAP provide the tools and capability to support operational data governance and stewardship across heterogeneous applications and warehouses, even if they all exist in SAP Hana – and, more importantly, when there is a hybrid model? This is required to assure the integrity of any real-time analytics platform or solution.
b) How will current customers that have invested in current technology migrate (and pay for it, willingly)?
Neither of these questions is easy to answer. In the first case, no vendor has yet solved this. There are numerous attempts going on in the industry. Master Data Management as a discipline is part of this dialog. As is semantic discovery and modeling. As is the business glossary. As are the logical data warehouse and logical operational data store. As is data quality. The fact is, even SAP would struggle to demonstrate this. SAP is, in this regard, like many other vendors: well aware of the issue and complexity, but it has built a successful business without having the need to solve this Holy Grail.
The second question adds the dimension of revenue to the same topic. I meet with SAP customers each week who tell me, "We are on a 3 (or 5) year program to consolidate x ERP systems (many are not SAP) to one (or a few) SAP ERP systems." There is virtually no appetite to invest significantly in any game-changing technologies. Investments in SAP Hana will therefore likely be opportunistic at best. The point is that the current investment has to yield some value. Business executives will hope that their returns will not get eaten up by aggressive competitors that were late to the ECC on-premises argument and that jump early onto SAP Hana. SAP wins both times, of course – so it's all revenue to them :)
All in all, this event was well worth the time investment: good exposure to customers, access to some executives, and even a couple of detailed demos. But what comes next? How can Oracle come up with the SAP Hana killer?
Big Content Needs More Metadata
In recent posts I’ve introduced the notion of Big Content as shorthand for incorporating unstructured content into the Big Data world in a systematic and strategic way. Big Data changes the way we think about content and how we manage it. One of the most important areas requiring a fresh look is metadata. Big Content expands the definition of metadata beyond the traditional tombstone information normally associated with documents (title, author, creation date, archive date, etc.). While these elements are necessary and remain foundational to both effective content management and Big Data, more is required. Big Content metadata encompasses any additional illuminating information that can be extracted from or applied to the source content to facilitate its integration and analysis with other information from across the enterprise. This expanded definition results in a three-tiered metadata architecture for Big Content.
At the bottom level of the architecture, a core enterprise metadata framework provides a small set of metadata elements that are applicable to the majority of enterprise information assets under management. These elements are often drawn from a well-known standard set such as the Dublin Core but can include whatever common elements are useful to the enterprise. This common framework provides the unifying thread that facilitates locating content from across the enterprise, making an initial assessment of its relevance, and submitting it to the content ingestion pipeline.
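As a minimal sketch of what such a framework might look like (the field names below are illustrative, loosely borrowed from Dublin Core rather than mandated by it):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# One record in a hypothetical core enterprise metadata framework.
# Field names are illustrative, not a prescribed standard.
@dataclass
class CoreMetadata:
    identifier: str           # unique ID across the enterprise
    title: str
    creator: str
    created: date
    content_type: str         # e.g. "contract", "video", "support email"
    source_system: str        # repository that holds the source content
    description: Optional[str] = None
```

Because a contract and a video share these same few fields, a single query can locate and triage both before any deeper, domain-specific processing happens.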
The second layer of the Big Content metadata architecture consists of domain specific elements that are not necessarily applicable to all enterprise content, but are useful to a particular area such as a brand, product or department. At this level, common metadata often exists under different labels depending on where it is created and which department owns it. This increases its value and utility to that department but makes it more difficult to leverage for content integration and analysis. To reconcile domain metadata it is often necessary to create a metadata map that resolves naming and semantic conflicts.
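To illustrate the pattern (the department labels and field names here are invented for the example), a metadata map simply translates each department's local names into the shared domain vocabulary before integration:

```python
# Hypothetical example: two departments label the same concepts differently.
# The map translates local field names into a shared domain vocabulary.
DOMAIN_METADATA_MAP = {
    "marketing": {"prod_name": "product", "launch_dt": "release_date"},
    "support":   {"item": "product", "ga_date": "release_date"},
}

def reconcile(record: dict, department: str) -> dict:
    """Rename a department's local metadata fields to the shared names."""
    mapping = DOMAIN_METADATA_MAP[department]
    return {mapping.get(key, key): value for key, value in record.items()}

# reconcile({"item": "Widget X", "ga_date": "2013-04-01"}, "support")
# -> {"product": "Widget X", "release_date": "2013-04-01"}
```

Resolving semantic conflicts (two departments using the same label to mean different things) takes more than renaming, but the map is where that reconciliation gets recorded.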
The top layer of the metadata architecture consists of application specific metadata. This is additional information about content that is only relevant to the use-case at hand and the application facilitating its execution. As such it is not created or stored in the content management systems hosting the source content. It is created solely for the purpose of structuring and augmenting the content to be utilized within a vertical application in the Big Content environment.
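A short, hypothetical sketch of that layering: the enrichment fields below exist only for one analytics application and are never written back to the source content management system.

```python
# Hypothetical application layer: add fields that matter only to one
# analytics use case; they stay in the Big Content environment and are
# never written back to the source content management system.
def augment_for_app(record: dict) -> dict:
    enriched = dict(record)                 # leave the source record untouched
    enriched["sentiment_score"] = 0.72      # produced by the app's own pipeline
    enriched["campaign_tag"] = "spring-launch"
    return enriched
```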
Throughout the entire Big Content lifecycle, ensuring metadata quality and integrity is of the highest importance. Quality measures must go beyond simply reconciling field names. It is important that the steps taken to enrich and refine content are applied consistently. If some dates are not normalized, entity extraction is incomplete, or terminology is not reconciled, the accuracy of the data behind the insights comes into doubt. As a result, any analysis and its findings become questionable. Metadata represents a significant upfront investment and ongoing requirement when large amounts of content are involved. Nevertheless, it is a critical factor in effective content management and the key enabler of the Big Content ecosystem.
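The checks below are a minimal sketch of such quality gates; the approved glossary and field names are assumptions for the example, and a real pipeline would draw its rules from the enterprise's own standards.

```python
from datetime import datetime

# Assumed glossary of reconciled field names (illustrative only).
APPROVED_FIELDS = {"identifier", "title", "created", "product",
                   "release_date", "entities"}

def check_record(record: dict) -> list:
    """Return a list of quality problems; an empty list means the record passes."""
    problems = []
    # 1. Dates must be normalized to ISO 8601 (YYYY-MM-DD).
    for key in ("created", "release_date"):
        if key in record:
            try:
                datetime.strptime(record[key], "%Y-%m-%d")
            except (TypeError, ValueError):
                problems.append(f"{key} is not a normalized ISO date")
    # 2. Entity extraction must have produced at least one entity.
    if not record.get("entities"):
        problems.append("entity extraction missing or incomplete")
    # 3. Every field name must come from the reconciled glossary.
    unknown = set(record) - APPROVED_FIELDS
    if unknown:
        problems.append(f"unreconciled terminology: {sorted(unknown)}")
    return problems
```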
Resetting the Definition of IT-GRC at Gartner
IT-GRC is essentially enterprise GRC functions (workflow, data repository, regulatory mapping, etc.) focused on IT-specific needs. The only reason we have IT-GRC is because, traditionally, the original GRC vendors were focused on addressing SOX and other global financial integrity regulations and were terrible at IT requirements. That gap is closing, however.
Over the last two years, IT-GRC has begun to bifurcate into IT-related GRC functions and security operations functions. These market changes have caused us to reset Gartner’s use of the term IT GRC to provide useful guidance to our clients in selecting appropriate technologies for their requirements.
GRC is the most worthless term in the vendor lexicon. Vendors use it to describe whatever they are selling and Gartner clients use it to describe whatever problem they have. See my previous post Why I Hate the Term GRC.
In 2013, there is little evidence that security technology data is being used in any material or comprehensive manner to directly support senior IT and business leadership in decision making. However, there is an important evolution in the prioritization and remediation of vulnerability and security configuration management data using business context that is changing vulnerability management and other security operations use cases. This evolution will be covered separately from IT GRC technologies.
Gartner experience on client and reference calls has indicated that IT GRC needs fall roughly in two areas. The first supports oversight and governance functions that typically bridge IT information to support IT and business leadership for reporting and decision making. This is present in use cases such as vendor risk management, policy management, integrated risk reporting and risk assessment. The second supports information security operations requirements through the centralization of security technology data. This is present in use cases such as vulnerability management, continuous monitoring and the management of technology-centric compliance requirements such as Payment Card Industry Data Security Standard (PCI DSS).
Consider a metaphor where a horizontal line is used to separate IT from non-IT business needs (see figure below). The first area can be described as "above the line," and the second area can be described as "below the line."
Using patch management as an example, the operations functions that monitor patch states, prioritize and guide remediation are all within the first line of defense. They are considered below the line and not within the definition of IT GRC. The governance functions that use patch information to rate business units on patching effectiveness to guide risk-related decision making are part of the second line of defense. They are above the line and considered to be a part of core IT GRC activity.
IT GRC technologies and providers for above-the-line use cases will be published in the latest MarketScope for IT GRC. Below-the-line requirements will be addressed, in part, as an extension of vulnerability management. There is no hard definition for below-the-line use cases that have been excluded from IT GRC because this is an evolving set of solutions that include traditional IT GRC vendors and vulnerability management vendors.
Our new definition of IT-GRC
IT GRC technologies are used primarily to bridge IT-related data in support of senior IT and non-IT decision making. This is composed of functions for mapping controls into control objectives, survey capabilities, workflow to support non-IT decision making, and non-IT executive reporting.
The use cases for security operations will no longer be referenced as IT GRC at Gartner and will be considered an extension of vulnerability management research for the benefit of IT operations. This is composed of functions for the import of technical data from third-party products, workflow to support prioritization and IT remediation activities, and an IT asset database supporting IT decision making.
IT GRC is composed of functions to support non-IT decision making and non-IT executive reporting:
- Controls and policy mapping.
- Survey capabilities.
- GRC asset repository.
- IT risk evaluation and dashboards.
The functions supporting data import from third-party security tools, such as vulnerability assessment and security configuration management, remain a part of IT GRC. However, these functions are primarily used in support of the below-the-line security technology use cases.
These changes seem to have everyone in a tizzy. But here’s the bottom line: Security operations is security operations. Gartner is not going to call that IT GRC. So there.
Follow me on Twitter (@peproctor)
2013 “ALM” MQ
We’ve begun work on the update to the Magic Quadrant for ALM. We are subtly shifting our terminology for the market from Application Lifecycle Management to Application Development Lifecycle Management. We feel this is a more accurate depiction of what the tools in this space are focused on.
Participants this year all have at least 200 active customer installations and $5M per year in revenue. These vendors are also actively brought up by our clients on calls and thus have created some recognition at the enterprise level. There are many other products in the market that are effective for specific roles or at the project level; our goal is to look at tools that are effective at scale, and most of the vendors being covered have installations with 1,000 or more users.
We will be using the same criteria as the prior edition of the MQ. We will, however, have an additional use case for product development, which supports the 2011 Maverick research by Matt Hotle on the shift from Projects to Products.
We will also update several surrounding documents and are working on pieces around the market sub-segments.
The beginnings of the CRM Customer Engagement Center
Each April or May for the past 12 years we have published the “Magic Quadrant for CRM Customer Service Contact Centers,” but this year we’ve dared to disturb the universe by replacing it with the “Magic Quadrant for the CRM Customer Engagement Center.” (If you are a Gartner client you can find it at http://www.gartner.com/document/2482521.) The question my colleagues are asking is: in a world where you are rated, in part, by how many people click on your research, WHY abandon a very highly read document for one that no one has ever heard of? Well, everyone from the Midrash to Karl Marx to Chaim Potok has written that all beginnings are difficult. Yes, but why begin at all?
The genesis of the Customer Engagement Center idea is evolutionary, and in no way revolutionary. Call Centers ruled the ’70s through the 1990s, and Contact Centers have ruled ever since. Dinosaurs too ruled for a long time, and in the same way, Contact Centers are now facing an asteroid collision of their own. Why? Because just responding to a customer’s immediate request for help is grossly insufficient. Phone, email, IVR and chat are all fab when designed properly. But that is reactive for the most part. Today, were we to drop our organizational handcuffs, we would have the ability to extend our reach into Social Media. Is a customer or prospect who posts to Facebook, or to a community site, or out to Twitter, any less deserving of our attention?
The major transition is from “Contact” to “Engagement.” It will take most organizations a long time to relinquish the idea that the Digital Marketing group does the listening, but no one does the responding – at least not systematically. It is not marketing’s job, but neither, in most organizations, is it the job of Customer Support to engage on social media. And what about consistent business rules? Today there is one set of rules for traditional contact, and another, quasi-al fresco approach to social engagement.
The bottom line: sometimes it’s good to stretch one’s field of vision. The phrase “Customer Engagement Center” may or may not ever enter the common vocabulary of IT or business buyers. Already there have been many naysayers (like Hem, Haw, Sniff and Scurry from Who Moved My Cheese?) who think the change is too big, or that it is much ado about nothing. And that might just be. Life has a way of self-healing, and the new term may be scabbed over and gone. OR, something else might happen: organizations will start to see that the concept of customer engagement – the act of treating customers with intent, integrity and consistency, and gaining their trust – is a winning ticket.
What do you think? Flash in the pan, or an idea with legs?
(Thank you for an amazing Customer360 Conference in San Diego – just great attendees, providing such great feedback and asking wonderful questions!! Off to London in a couple of weeks to see how our EU clients are doing – see http://gtnr.it/181SMyp )
What Does Your Brand Stand For?
It may sound like some sort of high-minded existential question, but I’d argue that it’s the essence of everything you do as a modern marketer. The best marketers can answer it without flinching. They have a clear, unambiguous raison d’être that, as my colleagues Richard Fouts and Jennifer Beck say, resolves the ever-important and perhaps more visceral derivative of the same question: Why do you get out of bed in the morning?
To generate awareness, drive revenue, stock price or even to disrupt markets isn’t good enough. Why? Because you’re unlikely to do these things well without a clear understanding of what your brand stands for.
In today’s hypercompetitive markets, customers have a superabundance of choice—choice in how they allocate their time, attention, budgets and discretionary dollars. Brands that secure these scarce resources are the ones with empathy. They’re the opposite of egocentric. They’re intimately aware of whose lives they make better and in what specific ways. Their brands stand for ideals that resonate with audiences including but by absolutely no means limited to shareholders. Getting there starts with a few questions:
- What specific problems do we solve?
- Why are these problems worth solving?
- By doing so, whose lives do we make better?
- And by making their lives better, what impact are we having on the world?
- What issues do our customers care about and how do we advocate for them?
- How do we do all of this in ways that are appreciably better or more effective than our rivals?
By answering these questions, you’ve begun to unpack the essence of what your brand stands for. And once you do that, you’re better equipped to tell stories that audiences want to hear—and want to share.
What does your brand stand for?
ATM Heist points to fundamental business and technology issues in the payment systems
The recently disclosed $45 million worldwide ATM cash-out heist (see bankinfosecurity.com) points to many practical business and technology issues that payment system participants face.
Here are just a few of them:
a) One of the more troubling issues of these breaches is the difficulty in determining the points of the network chain that were breached by the fraudsters. This makes it very difficult for card issuers to recover their lost funds because they don’t know who is liable for the breach.
b) From conversations I’ve had with various issuer clients regarding recent breaches, the card brands (Visa and MasterCard) are often not as helpful in helping card issuers recover funds as the issuers would like them to be, perhaps because the card brands don’t know where to assign the liability.
c) Frankly, from a holistic viewpoint, companies that accept or process card payments are in a no-win situation when it comes to a breach. They can do their best and spend lots of money and time becoming PCI certified, but this gives them no safe harbor from penalties that are incurred if they are still breached. And the auditors (qualified security assessors) that certify these eventually breached companies as PCI compliant have BIG disclaimers in their contracts that they take NO responsibility if in fact their clients are breached.
d) There are so many parties in the payment chain that it is very difficult to assign blame in these types of breaches. For example, there can easily be seven roundtrip hops or more between an ATM cash disbursement request and the cash disbursement. The leakage can happen at any of those points or hops.
e) A point-the-finger, assign-blame approach is, in the end, a dead end and a lose-lose for all parties concerned. A win-win approach would be to strengthen the security of the card payment system through stronger user authentication and more secure media used to request payments or cash withdrawals (e.g. chip and PIN based on the EMV standard).
f) Until then, we will continue to try to keep a leaky, insecure payment system secure. It reminds me of the little Dutch boy who stuck his finger in the dike and successfully stopped the sea water from flooding his home town. He was successful because he stopped the leak while it was very small. I think we are too late when it comes to our global card payment systems. In this instance, we probably need at the least a major cyber-army.
A Mobility Extravaganza at Catalyst San Diego
We have two tracks on mobile at Catalyst this year depending on your role and the stage of your organization in mobile planning. Both tracks run for the duration of the conference.
One of the tracks is "Making Mobile Work". This track addresses fundamental mobile infrastructure projects that I know you’re working on today, including:
- Mobile performance
- App development and testing
- Mobile device security and SSO
- Mobile device management (MDM)
The other track is "Making Employees Productive in a Cloud and Mobile World". For organizations that are a bit further along in setting up mobile infrastructure and are now thinking about what they want users to do on the devices, this is the track for you. These are the next set of mobile projects after you’ve addressed BYOD, MDM, application development frameworks, etc. Topics include:
- How iPads are being used in enterprises today
- How mobile usage can translate into productivity
- How organizations that use Microsoft Office products on the desktop can survive on non-Microsoft mobile devices (or whether you need a productivity suite at all)
- Mobile file sync
These are just highlights – there are many more sessions you can peruse on the Catalyst website. I hope to see you there!
Hello and welcome to my re-launched Gartner blog (please provide your own champagne and party poppers). This revamp promises more relevant and timely posts, updated bi-weekly (with the occasional guest post), focused on the issues of today and tomorrow that will affect global IT powerhouses, your business and us all as individuals.
With the latest insights on how global economics and market forces are challenging IT vendors and departments to act creatively, I hope you’ll find the new blog a valuable resource during your day’s reading. Please keep the discussions rolling by contributing with comments to help further debate and your own insights into how we can survive and thrive in these uncertain economic times.
Will Private Cloud Adoption Increase by 2015?
My latest research note is now available, titled Will Private Cloud Adoption Increase by 2015? (Gartner client access required).
Understandably (I hope!) I can only tell you so much, as we want you to access and read the entire article, but the premise is that x86 server virtualization will continue to be a key focus of data center activity and growth through at least 2015. No big surprises there. However, infrastructure vendors need to recognize that the journey to data center automation and private cloud is less certain.
One little teaser … David Coyle, my team manager, really liked this line:
Most significantly, the adoption of private cloud should not be assumed to be a foregone conclusion resulting from high levels of virtualization.
The note draws upon primary research conducted in Brazil, China, India and the USA involving over 500 respondents. Thanks to my colleague and co-author Matthew Cheung for his help in guiding that primary research.
PS – for those with client access to gartner.com, look out for a range of new usability enhancements coming your way. I have had a sneak peek … and am loving it!
[Image: xkcd comic, used under a Creative Commons license]
Printable guns and the FUD factor
The web is all abuzz about the emergence of workable designs for the manufacture of simple guns via a 3D printer. Defense Distributed (http://defdist.org/) is behind much of this furor, as they have released CAD files that enable anyone with sufficient resources (an appropriate 3D printer, raw materials and basic mechanical skills) to print and assemble a functioning pistol. Already, various state legislatures are working on laws that seek to prohibit these activities (printing your own gun), and the US State Department has ordered Defense Distributed to take the files out of public distribution.
All of this hysteria is a wonderful example of bad risk management.
The basic logic presented by most of those who oppose the availability of 3D designs for printing guns appears to be that availability will enable (or possibly encourage) bad people to manufacture weapons that they will use to do bad things. Unfortunately, economic realities drive behavior in the opposite direction. Although I am confident that we will one day have relatively cheap 3D printers capable of printing high-density plastics or even some metals, at present the cost of the equipment and materials required to produce a functioning weapon is far in excess of the cost of a reliable weapon from a reputable manufacturer. Even if you are willing to invest in 3D printing resources to produce a weapon, you will end up with a weapon that will probably fire fewer than 10 rounds before failing – and that assumes that you printed it correctly, the designs were good, and you assembled and tested it adequately. Along the way, you might encounter a few prototypes that fail catastrophically (who wants to take the first shot?).
So, what’s a bad guy going to do? Purchase a stack of technical gear and powders, get it all working, assemble the output, test it and then use it (until it fails after a few shots) or is the bad person going to acquire a reliable weapon online, at a gun show or on the street for a few hundred dollars? The economics are simple. The market for weapons is fairly efficient and cheap, reliable weapons are available with no requirement for upfront capital costs for manufacturing capabilities.
The actual risk presented by Defense Distributed’s designs is negligible when compared to all of the other ways that weapons can be produced and acquired. By focusing on 3D printing of weapons, legislators and regulators perpetuate the poor risk thinking that has resulted in the open derision of many TSA activities. A much greater risk is that poorly written regulations will inhibit the development and adoption of 3D printing techniques by manufacturers – both established players and innovative start-ups.
And how could such a regulation be enforced? Require manufacturers of 3D printers to make their machines determine which designs are illegal and should not be printed? Would that mean that gun manufacturers could not use 3D printers in their manufacturing process? Block the distribution of designs? In addition to the free speech issues this might generate, one thing we have learned in the security business is that it is nearly impossible to stop illegal material from circulating on networks.
Humans are great at imagining risks and taking steps to mitigate fictional threats. If the objective is risk management, a real risk analysis is needed to drive cost effective investment in mitigation. Flights of fancy are not part of risk management.
How to Determine your salesforce.com Org Strategy
An org, in the salesforce.com vernacular, is a logical instance of data and metadata for a set of users. Selecting single org, multi-org or a combination remains a key decision that shapes the effectiveness of the salesforce.com projects for which business application managers are responsible. In a research note I recently published, How to Manage Salesforce.com Orgs for Optimal Benefit, I provide a framework business application managers can use to determine the most optimal salesforce.com org strategy for their organization.
Looking at my customer inquiries over the past year, salesforce.com org structures are among the top three questions I receive from clients about to embark on a salesforce.com project. Salesforce.com project managers must determine how many orgs they should use in their projects. An org is bound by both capacity limits (number of users and storage) and execution computing resources (query sizes and API limits). The specific limitations are determined by the specific salesforce.com offering and edition. Making an incorrect decision will therefore have an impact on the success of a salesforce.com project.
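To make the trade-off concrete, here is a back-of-the-envelope sizing sketch. Every limit value in it is a hypothetical placeholder rather than an actual salesforce.com figure; substitute the documented limits for the offering and edition you actually license.

```python
# HYPOTHETICAL limits for illustration; real values depend on your
# salesforce.com offering and edition.
EDITION_LIMITS = {
    "users": 10_000,              # capacity: number of users
    "storage_gb": 1_000,          # capacity: data storage
    "daily_api_calls": 1_000_000, # execution: API request ceiling
}

def fits_single_org(projected: dict, limits: dict = EDITION_LIMITS) -> bool:
    """True if every projected figure stays within the edition's limit."""
    return all(projected[key] <= limits[key] for key in limits)

# fits_single_org({"users": 12_000, "storage_gb": 400,
#                  "daily_api_calls": 250_000})
# -> False: the user count alone pushes you toward a multi-org strategy.
```

A real decision weighs more than raw limits, but if the projected load already exceeds a single org's ceilings, the multi-org question answers itself.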
There Is More To The US Open Data Policy Than Meets The Eye
On May 9, after a longer-than-expected preparation, the Open Data Policy announced as part of the US Digital Government Strategy was issued, together with an executive order signed by President Obama, “Making Open and Machine Readable the New Default for Government Information.”
As one reads the order, browses through the first few pages of the policy or watches the short video that CIO Steve VanRoekel and CTO Todd Park released to explain the policy, the first impression is that this is just a reinforcement of prior open government policies. The order is quite explicit in saying that (emphasis is mine):
The default state of new and modernized Government information resources shall be open and machine readable. Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable. In making this the new default state, executive departments and agencies (agencies) shall ensure that they safeguard individual privacy, confidentiality, and national security
Looking at the definition of open data in the policy itself, the first attribute of open data is being public, followed by accessible, fully described, reusable, complete, timely and managed post-release. Therefore one might think that this policy is mostly about encouraging agencies to pursue what was started four years ago with the Open Government Directive and build on the success of the many initiatives that Todd Park has relentlessly pushed since he became US CTO.
Even if this were the only focus of the policy, it would be a great accomplishment. The policy provides clarity on issues like the so-called “mosaic effect” (i.e. the risk that combining individual datasets may lead to identifying individuals), the need to prioritize data releases by engaging customers, the need to enforce privacy and confidentiality, and more. The policy also announces the establishment of a new resource called Project Open Data, an online repository of tools, best practices and schema to help agencies.
But there is more, and this is where the policy gets really interesting. As the Scope section says,
The requirements in part III, sections 1 and 2 of this Memorandum apply to all new information collection, creation, and system development efforts as well as major modernization projects that update or re-design existing information systems
Section 1 is about collecting or creating information in a way that supports downstream processing and dissemination activities, while section 2 is about building information systems to support interoperability and information accessibility. In the former, the policy asks agencies to “use machine readable and open formats for information as it is collected or created”. The latter suggests that “the system design must be scalable, flexible, and facilitate extraction of data in multiple formats and for a range of uses as internal and external needs change, including potential uses not accounted for in the original design”. Still in section 1, one can read: “Agencies must apply open licenses, in consultation with the best practices found in Project Open Data, to information as it is collected or created so that if data are made public there are no restrictions on copying, publishing, distributing, transmitting, adapting, or otherwise using the information for non-commercial or for commercial purposes”.
The scope section also says that
The requirements in part III, section 3 apply to management of all datasets used in an agency’s information systems
Section 3 is about strengthening data management and release practices. It says that “agency data assets are managed and maintained throughout their life cycle” and that “agencies must adopt effective data asset portfolio management approaches”. Agencies must develop an enterprise data inventory that accounts for the datasets used in the agency’s information systems. “The inventory will indicate, as appropriate, if the agency has determined that the individual datasets may be made publicly available.”
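As a hedged sketch of what one inventory entry might look like (the field names are loosely inspired by the Project Open Data metadata schema, but are illustrative here rather than a faithful copy of it):

```python
# One illustrative enterprise data inventory entry. Field names are
# loosely inspired by Project Open Data's schema, not copied from it.
inventory_entry = {
    "title": "Permit Applications 2013",
    "description": "All building permit applications received this year",
    "publisher": "Department of Buildings",
    "modified": "2013-05-09",
    "access_level": "public",   # or "restricted" / "non-public"
    "distribution": [
        {"format": "text/csv",
         "access_url": "https://example.gov/permits.csv"},
    ],
}

# The publishable subset of a full inventory is then just a filter:
# [entry for entry in inventory if entry["access_level"] == "public"]
```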
Now, let’s forget the first attribute of open data for a moment and look at how this applies to any data, even non-public data. Most of what is said above still holds. The enterprise data inventory is for all data; machine-readable and open formats apply to all data; interoperability and information accessibility apply to all data. Some, maybe most, data for some agencies will be public, but other data will not, and yet the same fundamental principles that treat data as the most fundamental asset still apply.
A while ago I wrote about the concept of basic data that the Danish government had come up with, and more recently I have written a research note about the importance of data-centricity in government transformation (subscription required). This policy seems to go in the same direction.
While its packaging and external focus are mostly about open public data, and in this respect it further develops policies we have seen over the past few years, its most disruptive implication is that the concept of “open by default” applies to any data.
It would have been beneficial to make a clear distinction between “open data” and “open public data”, but I understand that the constituencies that push for transparency and openness would not welcome the distinction, assuming that it would give the government the ability to decide at leisure which data to share and which to hide.
Nonetheless, the policy can be read and used as a means to initiate a tidal shift in how data is used across government. Section 5 of the policy is about incorporating new interoperability and openness requirements into core agency processes. Information Resource Management (IRM) strategic plans must align with the agency’s strategic plans and “provide a description of how IRM activities help accomplish agency missions”.
Finally, the implementation section puts the CIO at the very center of this change, without calling – at least explicitly – for any new role (such as a Chief Data Officer), and stresses that cost savings are expected and that “potential upfront investments should be considered in the context of their future benefits and be funded through the agency’s capital planning and budget processes”. Which is to say that openness is not a nice-to-have for which additional financial support should be expected, but is at the core of how agencies should operate to be more effective and efficient.
As I am a cynical analyst, I can’t just be complimentary of an otherwise brilliant policy without flagging one minor point where it might have been more explicit. In section 3.d.i the policy indicates the responsibility for “communicating the strategic value of open (public) data to internal stakeholders and the public”. This is great, as selling open public data internally is absolutely fundamental to getting support and making openness a sustainable practice. However, I would have loved an explicit mention of the need for agencies to use and leverage each other’s open public data, rather than suggesting that the only target is “entrepreneurs and innovators in the private and nonprofit sector”.
Let’s be clear: there is nothing in the policy that would either prevent or discourage internal use of open public data. But as the policy gets implemented, the balance and collaboration between CTO Todd Park, who will most likely continue pursuing the external impact of open public data, and CIO Steve VanRoekel, who chairs the CIO Council and will be mostly concerned with the internal use of information, will be crucial to making sure that openness by default becomes the new mantra.
What we don't know about private cloud
This is not one market, it’s a hundred (or hundreds of) markets. There is no real big pattern to where the inflection points are. There are lots of little patterns.
We do not know:
- The line between what applications we will run and what we will totally outsource to SaaS
- The lines between what we will leave bare metal, what we will virtualize and what we will actually run in any cloud
- The line between what we will do on a public cloud and in a private cloud
- The degree, magnitude, and timescale of how “virtual private cloud” and “hosted private cloud” move those lines
- The degree, magnitude, and timescale of how various approaches to cloudifying “legacy” apps via encapsulation, migration, replication, etc., move those lines
These lines are being drawn in different places by everyone – including by similar orgs (by sector, size, or whatever) and by different groups within the same org.
Add to that the range of things that are called “private cloud”–everything from “I have a data center with some servers in it” to “I’ve built my own EC2, EBS, S3, ELB, SQS, and SNS with open source software on commodity hardware and can automate ALL THE THINGS”.
Here’s something I do know: any number put forth for private cloud market size, growth, or spend is utterly daft.