The Dark Side of Cloud Computing

19.04.2013 by Craig Burton

When things go bad, they go really bad

At KuppingerCole we use Office 365 extensively to manage our documents and keep track of document development and distribution.

On April 9, 2013, Microsoft released a normal-sized Patch Tuesday update to Windows and Office products. The only thing is, this time the update completely broke the functionality of Office 365 and Office 2013. Trying to open a document stored in SharePoint would result in a recursive dialogue box asking you to authenticate to the SharePoint server. The same thing would happen when trying to upload a document. Excel and PowerPoint documents had the same problem.

Going to the Office 365 forum revealed a bevy of customers complaining about the problem. A Microsoft tech support person was offering possible solutions, all of which were just time wasters and solved nothing.

“First, please run desktop setup by following Set up your desktop for Office 365.
If the issue persists, please remove saved login credentials from the Windows credential manager and then sign into the MS account.
http://windows.microsoft.com/en-IN/windows7/Store-passwords-certificates-and-other-credentials-for-automatic-logon”

Finally, two days later a customer posted a solution.

“KB2768349 is definitely the culprit. I uninstalled this on Windows RT and login worked again across all Office 2013 RT apps. Reinstalling broke it. Uninstalling again fixed it.
Replicated on my Windows 8 desktop with Office 2013.
For the time being I have hidden KB2768349 from Windows Update until this is fixed.”

As soon as I uninstalled the KB2768349 update, the problem went away. I also learned what “hiding” an update entails.

For those of you dying to know, here is how you fix this thing.

Control Panel > Windows Update > View update history > Installed Updates

Scroll down through the Office 2013 updates until you find KB2768349. Select it and then uninstall.

Of course, once you uninstall an update, it’s going to show back up again and try to reinstall. The way you prevent this is to “hide” the update so it doesn’t keep showing up. To hide an update, open Windows Update, right-click the update you want to hide, and select “Hide update.” There you go.
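
For the scripting-inclined, the uninstall step can also be driven from a script. Here is a minimal sketch in Python, assuming Windows’ stock wusa.exe update installer; the KB number is the one from the forum post:

  import subprocess

  # Uninstall the problematic update. wusa.exe ships with Windows;
  # /quiet suppresses the prompts and /norestart skips the automatic reboot.
  subprocess.run(
      ["wusa.exe", "/uninstall", "/kb:2768349", "/quiet", "/norestart"],
      check=True,
  )

Hiding the update afterward still has to happen in the Windows Update UI, as described above.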

So for two days the normal operation of Office 365 was frustratingly broken. This was not just for me and my colleagues, but for everyone on the planet who used Office 365 and installed these updates. At the same time, the fix applies to everyone on the planet using Office 365 as well. In other words, critical apps in the cloud that go bad, go bad hard. They also heal big. Part of the deal.

I was surprised that I was the only one tweeting and complaining about it. I didn’t see a single article or public comment on this major screw-up. The only place I saw any complaining was on the Office 365 forum. So glad that was happening.


Cloud Computing and Standards

25.01.2013 by Craig Burton

Introduction

The three biggest trends impacting computing today are what I call the Computing Troika: Cloud Computing, Mobile Computing, and Social Computing.

There is a fourth trend that is on par with each of the Troika movements: the API Economy.

Finally there is the question of the role of standards in these trends.

First, here is my definition of Cloud Computing—and its opposite—Non-cloud Computing.

Cloud Computing

Cloud Computing involves offering network computing services with the following three characteristics:

  1. IT Virtualization
  2. Multi-tenancy
  3. Service re-usability

IT Virtualization—Network services, including management and support, that are geographically independent. That is not to say that services cannot be on-premise; it just means that location doesn’t matter.

Multi-tenancy—Network services that are offered to more than one tenant at a time.

Service re-usability—Network services that can be used and built upon for all tenants over and over.

All three are important. The one that needs explaining, because it is not so obvious, is the service re-usability feature. AD FS (Active Directory Federation Services) integrated into WAAD (Windows Azure Active Directory) is a good example. Because it is virtual and multi-tenant, a single SAML instrumentation to WAAD gives permissioned SSO and integration to EVERY customer connected to WAAD by default. This makes it highly leverageable and “reusable.” Further, all of the services to all tenants of WAAD have the same APIs, the same console UI, indeed the same infrastructure from top to bottom. This lets IT departments be much more efficient and competitive.

But here is where the new rubber meets the road. WAAD is specifically designed to give the customer real freedom of choice. This is done by not trying to keep the customer captive with either architecture design or terms of service.

The architecture design moves away from keeping the customer captive in a silo by means of two main features:

  1. Standards support
  2. APIs for everything

Standards support is the traditional bailiwick of interoperability. Interoperability is the key feature of services that are vendor independent. But standards are not a panacea. Standards move slowly. Further, it is a myth that standards compliance guarantees interoperability. One vendor’s standard floor is another vendor’s standard ceiling. To remain competitive, vendors tend to tweak the standard—sometimes in excess—to maintain an advantage.

The new equalizer—for both the customer and the vendor—is the API Economy. By providing simple, open API access to everything, a vendor can still differentiate and yet offer real freedom of choice in its services. With a complete API infrastructure, services are no longer silos. Any customer or competitor can duplicate or extend the apps that use the services (such as an admin console, user portal, or developer portal) without repercussion.

Non-cloud Computing

“Non-cloud” then becomes any architecture design that does not include all three of these features. Note that this puts even more importance on the API Economy. The IT computing silo prison can only be broken through an active API Economy. The key to successful customer-centric product design is giving the customer freedom of choice. Freedom of choice granted at a captor’s pleasure is no freedom at all; it must be vendor independent. Independence can only be gained through the API Economy coupled with the traditional standards process.

Standards

There is more than one type of standard.

The three main types are:

  1. de Facto
  2. de Jure
  3. de Rigueur

De Facto—Latin for “in fact”; a standard by default rather than by decree. TCP/IP is actually a de facto standard. Governments declared that the standard network protocol would be the de jure OSI standard. As we all know, OSI never happened. TCP/IP is the de facto standard of the internet.

De Jure—Latin for “by law”; in standards work, a standard ratified by a committee. HTTP is a de jure standard, ratified through the IETF.

De Rigueur—French for required by the current fashion. De rigueur standards go far beyond “fashion”, though. Both de facto and de jure standards are very slow moving. In fact, a de jure standard is—except for governments—obsolete and certainly out of fashion by the time the committee ratifies it.

I bring up the distinction between these three kinds of standards because I expect to see a rapid proliferation of de rigueur standards, built on top of both de facto and de jure standards, that are highly usable and can be referred to and used as “standards”. They can provide interoperability without either a laborious and expensive de jure process or the expectation of an accidental de facto crown.

For example, the use and creation of “Graph API” designs and methods, as we see in the Facebook and WAAD API designs, are going to become standards independently of any committee. Of course, the thought of this kind of talk scares a lot of people to death because of the kind of crazy behavior we see from vendors like Twitter and what it has done to its developers and its API.
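
To make the “Graph API” pattern concrete, here is a minimal sketch in Python of the kind of call Facebook popularized: a plain HTTP GET against a node in the graph, returning JSON. This is illustration only; the access token is a placeholder, and WAAD’s URLs differ in the details:

  import requests

  ACCESS_TOKEN = "placeholder-token"  # hypothetical; obtained via OAuth

  # Graph-style APIs address every entity as a node under one root URL.
  resp = requests.get(
      "https://graph.facebook.com/me",
      params={"access_token": ACCESS_TOKEN},
  )
  resp.raise_for_status()
  user = resp.json()  # e.g. {"id": "...", "name": "..."}
  print(user.get("name"))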

But it is my opinion that when vendors act in such irresponsible ways, they do so at their own peril. I believe that in the long term we can successfully lean on rational thought and behavior to support a strong three-standard ecosystem that works.


NSTIC Update

24.09.2012 by Craig Burton

National Institute of Standards and Technology awards $9M to support trusted identity initiative

Introduction

On September 20, 2012, the National Institute of Standards and Technology (NIST) announced more than 9 million USD in grant awards in support of the National Strategy for Trusted Identities in Cyberspace (NSTIC).

The grants were awarded to five consortiums. All of them big. All of them representing different views and technologies with a strong focus on identity, security, and trust.

NSTIC Background

While many identity and security professionals are familiar with the Obama administration’s NSTIC program, many international professionals are not. In order to address all of KuppingerCole’s constituents, some background information is useful.

The impetus for the NSTIC policy move by the Obama Administration is part of the Cyberspace Policy Review published in June 2009. The administration appointed Howard Schmidt to a new Cyber Security Coordinator position. Schmidt is a well-known security expert and is experienced in international security policies and technologies.

On Tuesday, December 22, 2009, Schmidt was named as the United States’ top computer security advisor to President Barack Obama. Previously, Schmidt served as a cyber-adviser in President George W. Bush’s White House and has served as chief security strategist for the US CERT Partners Program for the National Cyber Security Division through Carnegie Mellon University, in support of the Department of Homeland Security. He has served as vice president and chief information security officer and chief security strategist for eBay.

Prior to joining the Obama Administration, Schmidt served as President of the Information Security Forum and President and CEO of R & H Security Consulting LLC, which he founded in May 2005. He was also the international president of the Information Systems Security Association and a board member of the Finnish security company Codenomicon, the American security company Fortify Software, and the International Information Systems Security Certification Consortium, commonly known as (ISC)². In October 2008 he was named one of the 50 most influential people in business IT by readers and editors of Baseline Magazine.

Source: Wikipedia

Under Schmidt’s direction, and managed by NIST, the first draft of NSTIC was published in June of 2010. The draft received much criticism for its lack of privacy protection for individuals and for the size of the role played by the government. A final draft was rewritten and published in May of 2011. In the final draft, the role of the government was reduced and the privacy issues were addressed.

The stated objectives of the NSTIC initiative are:

NSTIC is a White House initiative to work collaboratively with the private sector, advocacy groups and public-sector agencies. The selected pilot proposals advance the NSTIC vision that individuals and organizations adopt secure, efficient, easy-to-use, and interoperable identity credentials to access online services in a way that promotes confidence, privacy, choice and innovation.
“Increasing confidence in online transactions fosters innovation and economic growth,” said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher. “These investments in the development of identity solutions will help protect our citizens from identity theft and other types of fraud, while helping our businesses, especially small businesses, reduce their costs.”
NSTIC envisions an “Identity Ecosystem” in which technologies, policies and consensus-based standards support greater trust and security when individuals, businesses and other organizations conduct sensitive transactions online.
The pilots span multiple sectors, including health care, online media, retail, banking, higher education, and state and local government and will test and demonstrate new solutions, models or frameworks that do not exist in the marketplace today.

The Announcement

As expected, NIST picked big consortiums with big ideas for identity and trust across a broad spectrum of technologies and market segments. Here are the basics of its choices:

“These five pilots take the vision and principles embodied in the NSTIC and translate them directly into solutions that will be deployed into the marketplace,” said Jeremy Grant, senior executive advisor for identity management and head of the NSTIC National Program Office, which is led by NIST. “By clearly aligning with core NSTIC guiding principles and directly addressing known barriers to the adoption of the Identity Ecosystem, the pilot projects will both promote innovation in online identity management and inform the important work of the Identity Ecosystem Steering Group.”

The grantees of the pilot awards are:

The American Association of Motor Vehicle Administrators (AAMVA) (Va.): $1,621,803
AAMVA will lead a consortium of private industry and government partners to implement and pilot the Cross Sector Digital Identity Initiative (CSDII). The goal of this initiative is to produce a secure online identity ecosystem that will lead to safer transactions by enhancing privacy and reducing the risk of fraud in online commerce. In addition to AAMVA, the CSDII pilot participants include the Commonwealth of Virginia Department of Motor Vehicles, Biometric Signature ID, CA Technologies, Microsoft and AT&T.
Criterion Systems (Va.): $1,977,732
The Criterion pilot will allow consumers to selectively share shopping and other preferences and information to both reduce fraud and enhance the user experience. It will enable convenient, secure and privacy-enhancing online transactions for consumers, including access to Web services from leading identity service providers; seller login to online auction services; access to financial services at Broadridge; improved supply chain management at General Electric; and first-response management at various government agencies and health care service providers. The Criterion team includes ID/DataWeb, AOL Corp., LexisNexis® Risk Solutions, Experian, Ping Identity Corp., CA Technologies, PacificEast, Wave Systems Corp., Internet2 Consortium/InCommon Federation, and Fixmo Inc.
Daon, Inc. (Va.): $1,821,520
The Daon pilot will demonstrate how senior citizens and all consumers can benefit from a digitally connected, consumer friendly Identity Ecosystem that enables consistent, trusted interactions with multiple parties online that will reduce fraud and enhance privacy. The pilot will employ user-friendly identity solutions that leverage smart mobile devices (smartphones/tablets) to maximize consumer choice and usability. Pilot team members include AARP, PayPal, Purdue University, and the American Association of Airport Executives.
Resilient Network Systems, Inc. (Calif.): $1,999,371
The Resilient pilot seeks to demonstrate that sensitive health and education transactions on the Internet can earn patient and parent trust by using a Trust Network built around privacy-enhancing encryption technology to provide secure, multifactor, on-demand identity proofing and authentication across multiple sectors. Resilient will partner with the American Medical Association, Aetna, the American College of Cardiology, ActiveHealth Management, Medicity, LexisNexis, NaviNet, the San Diego Beacon eHealth Community, Gorge Health Connect, the Kantara Initiative, and the National eHealth Collaborative.
In the education sector, Resilient will demonstrate secure Family Educational Rights and Privacy Act (FERPA) and Children’s Online Privacy Protection Act (COPPA)-compliant access to online learning for children. Resilient will partner with the National Laboratory for Education Transformation, LexisNexis, Neustar, Knowledge Factor, Authentify Inc., Riverside Unified School District, Santa Cruz County Office of Education, and the Kantara Initiative to provide secure, but privacy-enhancing verification of children, parents, teachers and staff, as well as verification of parent-child relationships.
University Corporation for Advanced Internet Development (UCAID) (Mich.): $1,840,263
UCAID, known publicly as Internet2, intends to build a consistent and robust privacy infrastructure through common attributes; user-effective privacy managers; anonymous credentials; and Internet2’s InCommon Identity Federation service; and to encourage the use of multifactor authentication and other technologies. Internet2’s partners include the Carnegie Mellon and Brown University computer science departments, University of Texas, the Massachusetts Institute of Technology, and the University of Utah. The intent is for the research and education community to create tools to help individuals preserve privacy and a scalable privacy infrastructure that can serve a broader community, and add value to the nation’s identity ecosystem.

High Level Analysis

In terms of government initiatives, NSTIC has been moving at lightning speed. Jeremy Grant has been a proactive advocate of the initiative and is an articulate and capable leader. It shows in the choices of these consortiums and their constituents.

At the same time, 9 million dollars spread across five initiatives—each with many mouths to feed—does not go very far and can be used up very quickly. It will be interesting to see how far each will get over the next twelve months. I chose 12 months because I can’t see how the money awarded to each group will last much longer than that.

Each group will need to put a plan together and execute in that time frame if they are to survive.

Over the next short period, we will take a closer look at each initiative, what their respective architectures look like, and what their specific objectives are in their roles in the identity ecosystem outlined by NIST.

Of course, I will be paying special attention to what each consortium has planned as an API Economy strategy. Each will need to have a solid API design that gives all of the other groups API access to all of the services through both the Web Services legacy (SOAP) and the emerging API Economy imperative (RESTful).

If each group does not have a solid SOAP/RESTful API strategy, they simply will not succeed—either individually or as a whole.

I know it sounds strange coming from me that an organization should continue embracing the SOAP legacy, but there are just too many government and non-profit organizations that cannot afford to jump to the real world quickly and must continue carrying the burden of the past. So it is sometimes.
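
To illustrate what carrying both imperatives looks like in practice, here is a sketch of the same “look up a user” operation issued both ways from Python. The endpoints and message schema are invented for illustration, not taken from any pilot:

  import requests

  # Legacy Web Services style: a SOAP envelope POSTed over HTTP.
  soap_body = """<?xml version="1.0"?>
  <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
      <GetUser xmlns="http://example.org/identity">
        <UserId>jdoe</UserId>
      </GetUser>
    </soap:Body>
  </soap:Envelope>"""
  soap_resp = requests.post(
      "https://pilot.example.org/identity/soap",
      data=soap_body,
      headers={
          "Content-Type": "text/xml; charset=utf-8",
          "SOAPAction": "http://example.org/identity/GetUser",
      },
  )

  # Emerging API Economy style: a RESTful GET returning JSON.
  rest_resp = requests.get("https://pilot.example.org/api/users/jdoe")
  user = rest_resp.json()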

Of course, there are many more issues involved with the success of this initiative beyond APIs. These issues will be covered more in depth in subsequent KuppingerCole reports and activities at the EIC Conference in May 2013.

Nonetheless, we see this move by NIST to grant these awards as positive, and it will have a reverberating impact on the identity community—across the globe—for some time to come.


SAML is Dead! Long Live SAML!

19.09.2012 by Craig Burton

Answers to the unanswered questions from the webinar

Introduction

Last Friday on Sept. 14, Pamela Dingle—Sr. Technical Architect from Ping Identity Corp.—and I conducted a free webinar about the much ballyhooed demise of SAML.

You can view the webinar in its entirety on the KuppingerCole website.

To us, the best measurement of interest in any given webinar is the drop-off rate: just how many people drop off during the presentation? We were very pleased by the interest in the topic, by the number of attendees, and by the fact that no one dropped off during the presentation and Q&A.

However, we did not have the time to answer all of the questions presented. The following is a sequence of questions and answers that were left open.

It could be a little disorienting to read this Q&A if you didn’t attend the webinar, so I recommend watching the webinar first to avoid any confusion or misunderstanding.

Webinar Questions and Answers

Q: Since organizations have still not migrated entirely to APIs, we still have web-browser-based applications. So my question is: instead of implementing different solutions, one for browser-based applications and one for APIs, do you suggest a common way to support both sets of users? Thanks

A: Using APIs does not preclude using the browser to access the information and resources provided by the API. In fact, using the browser for API access is quite common. The subtext of this presentation is that access is no longer limited to the request-response browser model that we know and love for traditional applications. We are now moving beyond that model to an interactive model.
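
As a minimal sketch of that answer, assuming a Flask service with invented names: a single JSON endpoint can serve browser-based applications and programmatic API clients alike, so there is no need to build two parallel solutions:

  from flask import Flask, jsonify

  app = Flask(__name__)

  # One resource, one implementation. The browser's JavaScript and any
  # non-browser API client call exactly the same endpoint.
  @app.route("/api/users/<user_id>")
  def get_user(user_id):
      return jsonify({"id": user_id, "name": "Example User"})

  # From a browser page, the same endpoint is consumed with:
  #   fetch("/api/users/jdoe").then(r => r.json()).then(render);

  if __name__ == "__main__":
      app.run()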

Q: As a follow up these companies could help us “leap frog” to newer protocols very quickly much like some countries skip the notion of “land line” because it’s easier to deploy cellular.

A: Great metaphor. Indeed, the combination of RESTful API interfaces (HTTP), OAuth, JSON, UMA, SCIM, and webhooks makes up what I think are the leapfrog technologies.

Q: Many companies are outsourcing IT functions to outside providers, at what point do we just take this to the n-th degree and just let an org like Google or Apple handle identity for us? Is that too scary?

A: I think the answer lies in a simple question: is the vendor that manages your identity your customer, or are you their customer? If the answer is the latter, it is very scary indeed. As long as we have the expectation that Identity Management should be free, and act as customers of the vendors that provide that service, they will be monetizing our identities to pay for the service. It will be up to the corporation or individual to choose which direction to take.

Q: What about devices not directly linked to people? I.e. do you have numbers that include the Internet of Things?

A: I tried to keep the numbers focused and understandable. Including inanimate and non-digitized items only strengthens the whole argument. Look for more info on the numbers in future conversations.

Q: Have you considered the impact of the availability of global identities on the problem you sketched?

A: I don’t think the availability of a global identity reduces any of the issues in the arguments. Global identities—assuming it will ever happen—just compounds the problem.

Q: Ok, Craig, how do you deal w/ 2.8B identities – who numbers them? Who vets them? What fraud is possible? What is the metasystem – and does it really matter whether it is OAuth/SAML/OpenID?

A: This is a multipart question and I will answer each part in turn. First off, it is 28 billion and not 2.8. 1) Different organizations—both open and private—will number these entities. 2) Some of them will be vetted and some not. This becomes a big problem we are still grappling with, especially when no single Identity Provider can even be considered the validation resource for even a fraction of the entities we are talking about. Look for more information on Trust Frameworks to understand more on this topic. 3) Yes, fraud is possible. Fraud will always be an issue. It needs to be minimized. I think we are on an encouraging course to resolve these matters. 4) The only metasystem proposed so far is the Identity Management as a Service architecture being designed by Kim Cameron at Microsoft in the form of Azure Active Directory. 5) Finally, in the end the protocols won’t matter, just as the argument of CSMA vs. Token Ring no longer matters. We will simply move up the stack. It gets a little more complicated at this level because there are no more layers in the stack to move up to. This is all layer 7 stuff. Layer 7.5?

Q: Will you be talking about this at IIW 15?

A: I am registered for IIW 15 and plan to attend. I will coordinate with Pamela to see if we can repeat this session during the conference.

Q: Just want to echo Pam’s point that the combinatorial explosion is an overestimate. Not all users and devices will connect to all services. Real-world ecosystems see users congregate in niches.

A: I think the combinatorial explosion is an underestimate. Even Pam’s soft-pedaled numbers are still staggering. If you recall, she thought that most organizations could look at provisioning devices in the thousands or tens of thousands. OK. To date, anything over 150 starts to create huge administrative overhead. This is not going to go away or be minimized by downplaying what has already happened. There are 400 million iOS devices alone. The numbers are staggering.

Conclusion

Thanks for the great questions and participation. I look forward to seeing people at IIW. I encourage anyone who attended this conference to attend IIW and the EIC next May in Munich.


Making Good on the Promise of IdMaaS

21.06.2012 by Craig Burton

As a follow-up to Microsoft’s announcement of IdMaaS, the company announced the soon-to-be-delivered developer preview for Windows Azure Active Directory (WAAD). As John Shewchuk puts it:

The developer preview, which will be available soon, builds on capabilities that Windows Azure Active Directory is already providing to customers. These include support for integration with consumer-oriented Internet identity providers such as Google and Facebook, and the ability to support Active Directory in deployments that span the cloud and enterprise through synchronization technology.

Together, the existing and new capabilities mean a developer can easily create applications that offer an experience that is connected with other directory-integrated applications. Users get SSO across third-party and Microsoft applications, and information such as organizational contacts, groups, and roles is shared across the applications. From an administrative perspective, Windows Azure Active Directory provides a foundation to manage the life cycle of identities and policy across applications.

In the Windows Azure Active Directory developer preview, we added a new way for applications to easily connect to the directory through the use of REST/HTTP interfaces.

An authorized application can operate on information in Windows Azure Active Directory through a URL such as:

https://directory.windows.net/contoso.com/Users('Ed@Contoso.com')

Such a URL provides direct access to objects in the directory. For example, an HTTP GET to this URL will provide the following JSON response (abbreviated for readability):

{ "d":  {
 "Manager": { "uri":"https://directory.windows.net/contoso.com/Users('User...')/Manager" },
 "MemberOf": { "uri":"https://directory.windows.net/contoso.com/Users('User...')/MemberOf" },
 "ObjectId": "90ef7131-9d01-4177-b5c6-fa2eb873ef19",
 "ObjectReference": "User_90ef7131-9d01-4177-b5c6-fa2eb873ef19",
 "ObjectType": "User",
 "AccountEnabled": true,
 "DisplayName": "Ed Blanton",
 "GivenName": "Ed",
 "Surname": "Blanton",
 "UserPrincipalName": Ed@contoso.com,
 "Mail": Ed@contoso.com,
 "JobTitle": "Vice President",
 "Department": "Operations",
 "TelephoneNumber": "4258828080",
 "Mobile": "2069417891",
 "StreetAddress": "One Main Street",
 "PhysicalDeliveryOfficeName": "Building 2",
 "City": "Redmond",
 "State": "WA",
 "Country": "US",
 "PostalCode": "98007" } 
}
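
As a sketch of how a client might issue the GET described above from Python; the bearer token and the exact authentication flow are my assumption, since the preview announcement does not spell them out:

  import requests

  TOKEN = "placeholder-token"  # hypothetical OAuth token obtained out of band

  resp = requests.get(
      "https://directory.windows.net/contoso.com/Users('Ed@Contoso.com')",
      headers={
          "Authorization": "Bearer " + TOKEN,
          "Accept": "application/json",
      },
  )
  resp.raise_for_status()
  user = resp.json()["d"]  # the payload is wrapped in "d", as shown above
  print(user["DisplayName"], user["JobTitle"])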

Having a shared directory that enables this integration provides many benefits to developers, administrators, and users. If an application integrates with a shared directory just once—for one corporate customer, for example—in most respects no additional work needs to be done to have that integration apply to other organizations that use Windows Azure Active Directory. For an independent software vendor (ISV), this is a big change from the situation where each time a new customer acquires an application a custom integration needs to be done with the customer’s directory. With the addition of Facebook, Google, and the Microsoft account services, that one integration potentially brings a billion or more identities into the mix. The increase in the scope of applicability is profound. (Highlighting is mine).

Now That’s What I’m Talking About

There is still a lot to consider in what an IdMaaS system should actually do, but my position is that just the little bit of code reference shown here is a huge leap in usability and simplicity for all of us. I am very encouraged. This is a major indicator that Microsoft is not only on the right leadership track to provide a specification for an industry design for IdMaaS, but also is well on its way to delivering a product that will show us all how this is supposed to work.

Bravo!

The article goes on to make commitments on support for OAuth, OpenID Connect, and SAML/P. No mention of JSONPath support, but I will get back to you about that. My guess is that if Microsoft is supporting JSON, JSONPath is also going to be supported. Otherwise it just wouldn’t make sense.

JSON and JSON Path

The API Economy is being fueled by the huge trend of making organizations’ core competence accessible through APIs. Almost all of the API development occurring in this trend is based on a RESTful API design with data encoded in JSON (JavaScript Object Notation). While JSON is not a new specification by any means, it is only in the last 5 years that JSON has emerged as the preferred data format, in lieu of XML. We see this trend only becoming stronger.

JSONPath is to JSON what XPath is to XML. The following table shows a simple comparison between XPath and JSONPath.

XPath   JSONPath            Description
/       $                   the root object/element
.       @                   the current object/element
/       . or []             child operator
..      n/a                 parent operator
//      ..                  recursive descent
*       *                   wildcard
@       n/a                 attribute access; JSON structures don't have attributes
[]      []                  subscript operator; XPath uses it to iterate over element collections and predicates, for JSON it is the native array operator
|       [,]                 union operator; in XPath it combines node sets, in JSONPath it allows alternate names or array indices as a set
n/a     [start:end:step]    array slice operator
[]      ?()                 applies a filter (script) expression
n/a     ()                  script expression
()      n/a                 grouping in XPath
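
To make the comparison concrete, here is a small sketch using the third-party jsonpath-ng Python library (one of several JSONPath implementations) against a fragment shaped like the WAAD response above:

  from jsonpath_ng import parse  # pip install jsonpath-ng

  doc = {
      "d": {
          "DisplayName": "Ed Blanton",
          "MemberOf": [
              {"DisplayName": "Operations"},
              {"DisplayName": "Executives"},
          ],
      }
  }

  # "$" is the root and "." the child operator, per the table above.
  expr = parse("$.d.MemberOf[*].DisplayName")
  print([match.value for match in expr.find(doc)])
  # -> ['Operations', 'Executives']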

Summary

As an industry, we are completely underwater in getting our arms around a workable—distributed and multi-centered—identity management metasystem that can even come close to addressing the issues that are already upon us. This includes the Consumerization of IT and its subsequent identity explosion, to say nothing of the rise of the API Economy. No other vendor has come close to articulating a vision that can get us out of the predicament we are already in. There is no turning back.

Because of the past lack of leadership at Microsoft about its future in Identity Management (the crew that killed off Information Cards), I had completely written Microsoft off as being relevant. I would never have expected Microsoft to regain its footing, do an about-face, and head in the right direction. Clearly the new leadership has a vision that is ambitious and in alignment with what is needed. Shifting to this much spot-on thinking in the time frame we are talking about (a little over 18 months) is tantamount to turning an aircraft carrier 180 degrees in a swimming pool.

I am stunned, pleased and can’t wait to see what happens next.

Reference Links

Identity Management as a Service — Original blog post by Kim Cameron

Reimagining Active Directory for the Social Enterprise (Part 1) — John Shewchuk’s post about Windows Azure Directory

Microsoft is Finally Being Relevant — My response to the announcement of IdMaaS

Reimagining Active Directory for the Social Enterprise (Part 2) — Shewchuk’s follow up post

The API Economy — KuppingerCole publication


LinkedIn Hacked—More Reason for IdM in the Cloud

07.06.2012 by Craig Burton

On June 6, 2012 LinkedIn was hacked and user accounts — names and passwords — were compromised.


Follow LinkedIn’s advice on addressing the matter.

There are just two things I want to say about this.

1. Service Providers should build hardened systems up-front

Any service provider that has a security architecture that stores names and passwords on a server somewhere has an unacceptable system design.

There is simply NO excuse for letting this happen — EVER.

LinkedIn management is acting like hashing and salting passwords is some new thing that it is all over as a result of the compromise.

This is silly; hashing and salting should have happened in the first place, not as an afterthought.
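
For the record, salting and hashing is decades-old, standard-library material. A minimal sketch in Python using PBKDF2 (bcrypt or scrypt are equally reasonable choices):

  import hashlib, hmac, os

  def hash_password(password):
      """Return (salt, digest); store these, never the password itself."""
      salt = os.urandom(16)  # unique random salt per user
      digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
      return salt, digest

  def verify_password(password, salt, digest):
      candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
      # constant-time comparison avoids timing side channels
      return hmac.compare_digest(candidate, digest)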

2. One More Reason for IdMaaS

If LinkedIn had been using IdMaaS for its Identity Management instead of its own “yet-another-funky-id-system”, it would not be standing knee-deep in PR feces with egg on its face.

This is because the designers of IdMaaS services specialize in identity and security. That is all they do. No social connecting, no email or friending. Just managing identity in the cloud.

Using specialized core services in the cloud fits with the best practices of good systems design. It is just too expensive and hard for every company to have the required expertise to design hardened systems in today’s IT environment.

Summary

The sooner we can start building on an Identity Metasystem design, the better.

Even scarier than LinkedIn being hacked — you can almost guarantee that many other cloud-based services you are using have a similar “yet-another-funky-id-system” design for IdM.

Scarier still — these systems probably won’t get fixed until they are compromised.


Security > 140 Conversation with Craig Burton

28.03.2012 by Craig Burton

I had a conversation with Gunnar Peterson recently. Here is the transcript of the exchange. It is short but worth looking at.

Today’s Security > 140 Conversation is with Craig Burton, a Distinguished Analyst at KuppingerCole. In his recent work, Craig explores the API Economy and how participating in the API Economy reconfigures organizations’ priorities.

Gunnar always asks insightful questions. I really enjoy his presentations each year at the Cloud Identity Summit. Not sure if I will be speaking this year or not.


How to Spot an Unnecessary Identity Fail

09.06.2011 by Craig Burton

I’ve been watching the recent announcements about how hackers—some speculate foreign countries—have cracked the security infrastructure of a system and have stolen the names and passwords of thousands—sometimes millions—of customers.

The details of all these disasters are not what I want to talk about. Just this simple and seemingly obvious point.

Any system that stores the names and passwords of anyone is a failed security design.

Symmetric vs. Asymmetric keys

In the late seventies, these three guys—Rivest, Shamir and Adleman (you probably know them as “RSA”)—published a paper describing a scheme for public-key cryptography.

They later formed a company based on this patented technology. Pretty much every systems company on the planet has ponied up and bought a license for some aspect of the technology.

If PKI is so good and so revolutionary to security design, why is this malicious theft of names and passwords happening?

I keep reading about how the RSA product line has been cracked and is no longer secure. We need to distinguish between the one-time password product (SecurID) and asymmetric cryptography.

The bigger question for me is: why are secrets that allow access being stored on the server in the first place?

Cryptographic protection can be implemented with symmetric keys or asymmetric keys. With the symmetric design, both the endpoint and the server keep copies of the keys. With an asymmetric design, the server NEVER sees or knows the private key. That key is stored only at the endpoint. To me, this is the main point of private and public key pairs in the first place.
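
Here is a sketch of that asymmetric pattern as a challenge-response login, using the pyca/cryptography library’s Ed25519 signatures (my choice of primitive for illustration). The server stores only the public key, so a server-side breach yields nothing that grants access:

  import os
  from cryptography.hazmat.primitives.asymmetric import ed25519

  # Endpoint: generate the key pair; the private key never leaves the device.
  private_key = ed25519.Ed25519PrivateKey.generate()
  public_key = private_key.public_key()  # only this is registered with the server

  # Server: issue a fresh random challenge for each login attempt.
  challenge = os.urandom(32)

  # Endpoint: prove possession of the private key by signing the challenge.
  signature = private_key.sign(challenge)

  # Server: verify against the stored public key;
  # raises cryptography.exceptions.InvalidSignature on failure.
  public_key.verify(signature, challenge)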

With that knowledge in hand, one has to ask, “Why would anyone—including RSA’s SecureID product—design a system that uses symmetrical keys?”

Good question. Answer: Poor cryptographic implementation decisions.

So now you can always spot a failed identity design. Anytime the details of a security compromise include the theft of user IDs and passwords, you can nod wisely and say, “Symmetric keys. What were they thinking?”

If you want to protect the names and passwords of your customers, an asymmetrical cryptography implementation is desirable.

By the way, just to stick it to whoever the idiot was at Microsoft that decided that the CardSpace design should be scrapped—CardSpace is the BEST security design at Microsoft that uses an asymmetric key design.

In hindsight, dumping CardSpace was clearly a political move, not a technical one.


Mono Resurrects Itself as Xamarin

19.05.2011 by Craig Burton

When I was deeply involved in technology and company acquisitions at Novell, I learned the hard way how difficult it is to merge disparate corporate cultures.

Money usually only helps a little.

Company after company acquired by Novell disappeared from the planet, oftentimes with disastrous results. It was only on occasion that an acquisition yielded any measurable benefit.

This is why I winced and expected the worst when Novell announced the acquisition of Ximian back in 2003. How Miguel de Icaza survived the Novell acquisition gauntlet is a mystery to me. When I read that Attachmate fired all of the people working on the Mono project a few weeks ago, I figured the axe had finally fallen and that the Mono project was dead.

Not a good thing. It certainly speaks to the visionary skills of the new Suse management team. Mono was the ONLY innovative thing happening at Suse. Everything else is just playing catch up to Red Hat.

Even the language of the announcement sucked:

“We have re-established Nuremberg as the headquarters of our SUSE business unit and the prioritization and resourcing of certain development efforts – including Mono – will now be determined by the business unit leaders there,” said Jeff Hawn, Chairman and CEO of The Attachmate Group in a statement sent to InternetNews.com. “This change led to the release of some US based employees today. As previously stated, all technology roadmaps remain intact with resources being added to those in a manner commensurate with customer demand.”

To fully understand this announcement, a quick lesson in “vendor speak” is appropriate. When a vendor invokes anything that resembles “our actions are based on customer demand,” you know that you are being fed a line. It is what magicians refer to as “misdirection”: a form of deception in which the attention of the audience is focused on one thing in order to distract it from another.

A vendor that states its future planning is based on customer demand is a vendor in cruise-mode with no budget or plan to do anything about the particular topic. Thus the interpretation of the vendor speak “…all technology roadmaps remain intact with resources being added to those in a manner commensurate with customer demand” is: “we have no logical explanation for this irrational behavior.” In other words, you’ve just been fed a line of bullshit.

Rising from the ashes

Then I heard the welcome surprise: Miguel announced the formation of Xamarin. Unlike the bumbling, headless Attachmate strategy, he nails a clearly articulated plan and vision for Xamarin:

“We believe strongly in splitting the presentation layer from the business logic in your application and supporting both your backend needs with C# on the server, the client or mobile devices and giving you the tools to use .NET languages in every desktop and mobile client.”

Yes!

I am so happy to see the Mono team emerge from 8 years of suppression, fighting for an incredibly visionary cause with no support, marketing budget, or corporate sponsorship.

Well done, Miguel. Breathe easy; the worst part is over.

Novell is dead, but—thank the Gods of good code—the Mono project lives on.


Bringing the Web to Life at Last

04.05.2011 by Craig Burton

It isn’t very often that an Internet principle comes along that is so important that it actually affects almost everyone and everything. The Live Web is one of those Internet principles.

The Static Web — the Internet as we know it today — has no thread of knowing or context. Until now, there has not been enough infrastructure in existence for a computer to do the work of presenting the Internet in a context of purpose. The Live Web presents an infrastructure and architecture for automating context on the internet. The Live Web brings to life the notion of context automation.



© 2014 Craig Burton, KuppingerCole