16.07.2012 by Martin Kuppinger
Kim Cameron recently blogged about his view on SCIM and the Microsoft Graph API. Kim explains why SCIM and the Microsoft Graph API, which is related to WAAD (Windows Azure Active Directory), are complementary. That reminded me of two older posts in my own blog.
Even though I didn’t focus explicitly on relationships in the second post but more on the management of entitlements, there is much about relationships in there implicitly. And when looking back at the concept of system.identity (which, from what I see, influenced WAAD and the Microsoft Graph API), it is also a concept that is much more about dealing with relationships and the ability to model a more complex reality than simple access via protocols like LDAP.
SCIM has by now become widely accepted as a concept and is likely not only to become a formal standard but a real one – one that is widely accepted and implemented. However, it appears it will remain a narrowly focused standard (to avoid the term “limited”) which addresses a specific problem. That is fine.
The Microsoft Graph API, on the other hand, addresses a much broader scope, though it is focused (as of now) on WAAD. As Kim explains in his post, there is a need for a “multidimensional” protocol like the Graph API which allows dealing with an identity and its relationships.
Like Kim, I see both approaches as complementary, not competitive. You should be able to do what SCIM does with an approach like the Graph API (one that is standardized and supported by many vendors). But that isn’t the core target of the Graph API and the concept behind it. So for the fairly simple use cases of SCIM, SCIM appears to be the solution of choice. For many requirements around dealing with information about identities and their relationships, the Graph API (and maybe a standardized successor in the future) will do the job.
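To make the complementarity concrete, here is a small Python sketch. All identifiers and relationship names are made up for illustration, and the SCIM payload is heavily simplified (it only loosely follows the SCIM core schema): a SCIM-style resource is a flat, self-contained record, while a graph-style protocol is about traversing typed relationships between identities.

```python
import json

# A SCIM-style resource: a flat, self-contained user record.
# Attribute names loosely follow the SCIM core schema; payload is simplified.
scim_user = {
    "userName": "mkuppinger",
    "name": {"givenName": "Martin", "familyName": "Kuppinger"},
    "emails": [{"value": "martin@example.com", "primary": True}],
}

# A graph-style view: identities plus typed relationships between them.
# Node and edge names here are purely illustrative, not the actual Graph API.
edges = [
    ("mkuppinger", "worksFor", "acme-corp"),
    ("mkuppinger", "memberOf", "project-x"),
]

def related(graph_edges, subject, relation):
    """Traverse the graph: all objects linked to `subject` via `relation`."""
    return [o for s, r, o in graph_edges if s == subject and r == relation]

# SCIM answers "what does this user record contain?"...
print(json.dumps(scim_user, indent=2))
# ...while a graph protocol answers "how is this identity related to others?"
print(related(edges, "mkuppinger", "worksFor"))
```

The point of the sketch: the flat record is exactly what simple provisioning needs, while the relationship traversal is what a “multidimensional” protocol adds on top.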
There is, though, another point in this discussion which should be considered: RESTful APIs and JSON are far easier to handle than “traditional” approaches in programming. The evolution of what we call the Open API Economy – you might have a look at Craig Burton’s blog and the KuppingerCole report on this topic written by Craig, or the video from the EIC session on that topic – shows that the acceptance of such relatively simple interfaces is growing rapidly. So the need for having only one standard for everything diminishes. There is no doubt that we need standards. But standards are also limitations – LDAP is one example of a limited and limiting standard. I know too many cases where LDAP just wasn’t (and isn’t) sufficient for the business’ needs. Notably, it was never intended to serve many of these use cases. It was built as an expedient, worked well, and then suffered as folks tried to pile on grossly inappropriate functions.
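A tiny, hedged illustration of that ease of handling (the payload and filter below are made up): consuming a REST/JSON response needs nothing beyond a standard library, while the equivalent LDAP interaction needs a dedicated client library, a bound connection, and LDAP filter syntax.

```python
import json

# Consuming a REST/JSON response needs only the standard library:
response_body = '{"userName": "jdoe", "department": "Sales"}'
user = json.loads(response_body)
print(user["department"])  # Sales

# The equivalent LDAP lookup requires a dedicated client library, a bound
# connection, and the LDAP filter syntax, e.g.:
ldap_filter = "(&(objectClass=person)(uid=jdoe))"
# ...sent over a connection speaking the binary LDAP protocol (ASN.1/BER).
```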
So my recommendation is not to artificially create a “battle of standards”. That isn’t of any value. Having a standardized Graph API in addition to SCIM (and maybe some other “lightweight” standards for specific use cases, like a next-generation standard interface to XACML fully supporting RESTful APIs and JSON) makes much more sense to me. Even though I think that the name “Graph API” isn’t well chosen (you need to associate the term “graph” with “graph theory” rather than with “graph” as in “diagram” or “chart” – so it’s more for the geeks), the concept makes a lot of sense. And SCIM (despite my criticisms) also has a lot of value in itself.
29.06.2012 by Martin Kuppinger
This week Jackson Shaw commented in his blog on an article written by John Fontana. The discussion is about the future of passwords and how federation and structures with IdPs (Identity Providers) will help us to avoid them. Both have somewhat different opinions. However, in both posts there is the idea of having an IdP, using federation, and getting rid of passwords.
My perspective is a little different and I’d like to add two important points (even though I think that Jackson is right with his skepticism regarding a quick replacement of passwords and with highlighting that password management solutions will be required by many organizations):
1) There is the need to authenticate to the IdP
2) We won’t rely on a single IdP in the future
For the first point you might argue that you can use stronger authentication mechanisms relying on more than one factor (i.e. more than knowledge, more than what you know) when relying on a central IdP or a very few trusted identity providers whose business it is to secure everything. But:
- We’ve learned that nothing is fully secure – think about the RSA incident last year and many other incidents.
- We’ve learned that companies earning money for providing a high level of security do not necessarily care much about their own security – think about the DigiNotar scandal.
- We know that rolling out hardware-based two- or multi-factor authentication is pretty complex and costly – and purely software-based approaches tend to be more limited regarding security. Approaches which require specialized hardware are unlikely to succeed, so until NFC (near field communication, with its own security issues) or security technology built into chipsets becomes standard, we will struggle with this.
- We also know that user acceptance is key to success – and many of the strong authentication approaches just fail here, like virtually all types of biometrics.
So passwords might still be the approach used by the central IdPs, leaving us in the “sad world of passwords” mentioned in these posts.
Even if we can solve these issues, there is the second one. The future will not see us relying on a single IdP. The future is about having multiple IdPs, even different IdPs for a single transaction and for different claims within one authorization request. I’ve touched on this topic repeatedly in my recent posts.
Many approaches like NSTIC are, or might be, limited in that they are country-specific. NSTIC is driven by a US agency. It is intended to be usable globally. But will it be accepted globally? Or will others decide on somewhat different approaches? LinkedIn and all the others mentioned by John Fontana can’t rely on anything which is focused on one country. They will have to support IdPs from all over the world, with different implementations and different levels of identity assurance. And that is true for everyone relying on the concept of IdPs and related SPs (Service Providers) or relying parties. Trust Frameworks will have to deal with the complexity of having many IdPs. (By the way: the Personal Data Stores John Fontana points to won’t help – Life Management Platforms, as described here and in the related report, at least provide some support, but not directly for authentication.)

Clearly none of this is about a world without passwords. That might happen if, for some interactions and transactions, you rely only on IdPs which assure authentication based on n-factor authentication with n>1, not pure knowledge. But it isn’t essential for any of these concepts – and as I stated: given that we will use more IdPs, moving away from an approach with “the central IdP everything relies on”, we will see a mix of authentication mechanisms.
You could “chain” such IdPs and rely on one strong, primary authentication. But then you are back at defining who that could be, setting up the entire “backend part” (between IdPs) of the Trust Framework, and finding one who is trusted by all AND accepted by the users.
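The idea of one transaction relying on claims from several IdPs, each with its own assurance level, can be sketched as follows. All IdP names, claim names, and assurance levels below are purely illustrative, not drawn from any real trust framework.

```python
# Claims asserted by different (hypothetical) IdPs, each with an assurance level.
idp_claims = {
    "gov-idp":    {"claims": {"age_over_18": True}, "assurance": 3},
    "bank-idp":   {"claims": {"account_verified": True}, "assurance": 2},
    "social-idp": {"claims": {"email": "jdoe@example.com"}, "assurance": 1},
}

def authorize(required, idps):
    """required: {claim_name: minimum_assurance}. Returns True only if every
    required claim is asserted by some IdP at a sufficient assurance level."""
    for claim, min_level in required.items():
        if not any(claim in idp["claims"] and idp["assurance"] >= min_level
                   for idp in idps.values()):
            return False
    return True

# One authorization request satisfied by claims from two different IdPs:
print(authorize({"age_over_18": 3, "account_verified": 2}, idp_claims))  # True
# A requirement no available IdP can meet at the demanded assurance:
print(authorize({"age_over_18": 4}, idp_claims))                         # False
```

Note that nothing in this logic says how each IdP authenticated the user – which is exactly the point: the mix of IdPs implies a mix of authentication mechanisms, passwords included.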
So my view is that there is little hope for a world without passwords within a foreseeable period of time. We might reduce the number of passwords; we might use stronger technologies in many situations. But relying only on federation and Trust Frameworks and so on is not about getting rid of passwords.
25.04.2012 by Martin Kuppinger
During my Opening Keynote at this year’s EIC (European Identity & Cloud Conference, www.id-conf.com), when talking about the Top Trends in IAM, Mobile Security, GRC, and Cloud Computing I used the term “Identity Explosion” to describe the trend that organizations will continue (or start) to re-define their IAM infrastructures in order to make them future-proof. I talked more about that in my presentation on “Re-engineering IAM to better serve your business’ needs” later during the conference. Interestingly, I heard the term “Identity Explosion” being used several times in other sessions after that, referring to my keynote.
So today I want to look at that buzzword, at what’s behind it, and at the impact of this “Identity Explosion”. When looking at IAM (Identity and Access Management), it’s about managing users and their access. However, most of the IAM infrastructures in place today were mainly built with the employee in mind. Even today I frequently observe in advisories that projects start with a focus on some (relatively) small groups of users, like the employees, some temporary workers, or maybe some of the business partners. However, the reality of many organizations is that they have – to use a real-world number – perhaps 28,000 employees and 4.5 million customers to deal with.
Thus one of the initial discussions in such advisories is always about ensuring that the scope is set wide enough: It is about looking at all potential types of users, at least during the conceptual phase. Organizations might start implementing for the internals, followed by business partners, and then the customers (and leads and prospects and suspects). But the design has to have the “Identity Explosion” in mind: this massive growth in the number of identities to deal with. That starts with simple things like the structure of identifiers and ends with scalability issues and the integration of different technical approaches, for example versatile, risk- and context-aware authentication and authorization. I’ve seen companies that chose their identifiers with only employees in mind struggle and spend a lot of money to fix that.
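The identifier problem can be made concrete with a small, purely hypothetical sketch: an identifier scheme chosen with only employees in mind (say, a short personnel number) collides or runs out once millions of customers arrive, whereas namespacing identifiers by population avoids a later, expensive redesign.

```python
# Hypothetical identifier scheme: qualify every local identifier with the
# population (namespace) it belongs to, so populations can grow independently.

def make_id(population: str, local_id: str) -> str:
    """Qualified identifier: population namespace + local identifier."""
    return f"{population}:{local_id}"

employee = make_id("emp", "28001")
customer = make_id("cust", "28001")  # same local number, yet no collision

print(employee, customer)
```

The design choice here is trivial, but it is exactly the kind of decision that is cheap at concept time and very costly to retrofit once 4.5 million customer records exist.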
But it is not only – and not even mainly – about the costs. It is about agility. If IT is not prepared to deal with all types of users and provide identity and security services for them, then IT will fail in supporting the business demands. These are about integration with partners and a tight interaction with the customers (and leads and so on). IT has to be prepared for that. It has to understand that there will be this “Identity Explosion” anyway, with a massively growing number of identities to deal with.
An interesting aspect which isn’t yet discussed much in this context is business policies, including segregation of duties. How do you deal with the situation in which the same person (e.g. you or me) could, at the same point in time, have the identities of a customer, a freelance broker, and an employee of the same insurance company? Three identities which have to be understood and managed: the same person might sell an insurance contract to himself and approve it, using three different identities.
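One way to think about that policy problem, as an illustrative sketch (all identity names and the rule itself are hypothetical): link each identity back to the physical person, and evaluate segregation of duties across all of that person’s identities rather than per identity.

```python
# Map each identity back to the physical person behind it (illustrative IDs).
identity_to_person = {
    "cust-4711": "person-1",   # Mr. X as customer
    "broker-99": "person-1",   # Mr. X as freelance broker
    "emp-28001": "person-1",   # Mr. X as employee
}

def sod_violations(actions):
    """actions: list of (identity, action, object) tuples. Flags any object
    that the same PERSON both sold and approved, regardless of which of the
    person's identities performed each action."""
    by_person = {}
    for identity, action, obj in actions:
        person = identity_to_person[identity]
        by_person.setdefault((person, obj), set()).add(action)
    return [key for key, acts in by_person.items() if {"sell", "approve"} <= acts]

actions = [
    ("broker-99", "sell",    "contract-123"),
    ("emp-28001", "approve", "contract-123"),
]
print(sod_violations(actions))  # [('person-1', 'contract-123')]
```

A policy engine that only sees “broker-99” and “emp-28001” as unrelated users would miss this conflict entirely – the link to the person is what makes the SoD rule enforceable.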
And what I’ve discussed so far is just a small bang. The big bang is the “Internet of Things”, at least for many organizations. An automotive vendor has to deal not only with its customers, dealers, employees, and suppliers. It also has to deal with the cars themselves, which again split up into many devices with their own “identity”. This again will increase the number of identities to deal with.
Having the “Identity Explosion” in mind when working on strategies, concepts, and implementation of IAM and all the related technologies helps avoid solutions which can’t scale with the changing business requirements. Thus looking at your current IAM and thinking about how to get ready for that is one of the things you should start doing now.
17.08.2011 by Martin Kuppinger
During the last years, there has been a lot of change in the Identity Provisioning market. Sun became part of Oracle, Novell is now NetIQ, BMC Control-SA is now at SailPoint, Völcker has been acquired by Quest, and Siemens DirX ended up at Atos. These changes, as well as other influencing factors like mergers & acquisitions, failed projects, and so on, lead to situations where customers start thinking about what to do next in IAM and around provisioning. Another factor is that provisioning solutions are sometimes implemented with a focus on specific environments – SAP NetWeaver Identity Management for the SAP ecosystem, Microsoft FIM for the Active Directory world. Not that these tools only support those environments, but they might end up as just another provisioning system. In addition, especially in large organizations, it is not uncommon that regional organizations start their own IAM projects. The result: there are many situations in which organizations think about what to do next in provisioning.
However, just moving from product A to product B is not the best approach. In most cases, the deployment of provisioning tools took quite a while. In many cases, a lot of customizations have been made. And even while there might be some uncertainty about the future of one or another product (or, in some cases, the certainty that the product will be discontinued sometime in the future), just migrating from one provisioning tool to another seems to be quite expensive for little added value.
From my perspective, it is important for organizations to move at their own pace. One approach to doing that is to put a layer on top of provisioning systems. I described several options in a research note (and some webinars) quite a while ago. The research note, called “Access Governance Architectures”, describes different approaches for layered architectures on top of provisioning products. I’ll write an update later this year, but the current version illustrates the basic principle well. By adding a layer on top of provisioning, which might be Access Governance, a Portal/BPM layer, or IT Service Management (or a mix), organizations can deal with more than one provisioning tool. The architecture is more complex than just using one provisioning tool. But if you are not able to rely on only one provisioning tool, it’s at least an approach that works.
Organizations can then, for example, replace provisioning tools fully or partially. The latter is quite common if complex customizations have been made for selected target systems. Organizations can deal with multiple provisioning systems that “just appeared” for some reason – M&A, specific solutions for a specific part of the IT ecosystem, or whatever. And they can move forward more flexibly than in a monolithic architecture. Yes, these approaches require some more architectural work at the beginning, but that pays off. It pays off in more flexible migrations, in avoiding migrations at all, and in fewer “political” conflicts with some of the lobbies within IT. It even makes it possible to change the integration layer without affecting the underlying provisioning systems. And for sure it allows interfacing with target systems in a flexible way, not only using provisioning tools but also service desks or other types of connectivity if required.
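The layering idea can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s architecture: all class names, system names, and the routing mechanism are made up. An integration layer routes each provisioning request to whichever underlying tool owns the target system, so callers never depend on a specific product.

```python
class ProvisioningBackend:
    """One underlying provisioning tool (or a service desk, etc.)."""
    def __init__(self, name):
        self.name = name
        self.log = []  # record of provisioning operations, for illustration

    def provision(self, user, target):
        self.log.append((user, target))
        return f"{self.name}: provisioned {user} in {target}"

class IntegrationLayer:
    """The Access Governance / portal layer on top of several backends."""
    def __init__(self):
        self.routes = {}  # target system -> backend that owns it

    def register(self, target, backend):
        self.routes[target] = backend

    def provision(self, user, target):
        # Callers never need to know which tool serves which system.
        return self.routes[target].provision(user, target)

layer = IntegrationLayer()
layer.register("ActiveDirectory", ProvisioningBackend("LegacyTool"))
layer.register("SAP", ProvisioningBackend("SAP-IdM"))
print(layer.provision("jdoe", "SAP"))
```

Replacing a provisioning tool then amounts to re-registering a target system with a new backend – the callers above the layer are untouched, which is exactly the flexibility argument made here.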
But in the end, the most important thing is that it allows customers to move forward at their own pace. Thus, before you think about migrating away from your current provisioning tool, think about how you can protect your investments and add value – through new functionality, through the business-centric interfaces of Access Governance, and through the increased flexibility of your IAM environment.
10.08.2011 by Martin Kuppinger
Is there a mismatch between the reality in organizations and the implementations of at least several of the Identity Provisioning and Access Governance solutions when it comes to the representation of physical persons in IT? To me it appears that there is a mismatch.
The reality in all large organizations I know is that the real world is sort of 3-tiered:
- There is a physical person – let’s call him Mr. X
- Mr. X can act in very different contexts. You might call them roles or digital identities; however, all of these terms are overloaded with meanings. I’ll give three examples. 1. Mr. X might be an employee of an insurance company and a freelance insurance broker for the same company at the same time. 2. Mr. X might be an employee of a bank and a customer of that bank. 3. Mr. X might be the managing director of both ABC, Inc. and DEF, Inc., which are both owned by XYZ, Ltd., where he is employed as well.
- In each of these contexts, Mr. X might have more than one account. If he acts as an external freelance insurance broker or as a customer, that might be only one account. If he is the managing director of several corporations within a group, he might have different Active Directory accounts, different RACF accounts, different SAP accounts, and so on.
You might argue that these are exceptions. However, being a customer of one’s employer isn’t an exception in many organizations. And, by the way: a good and valid model has to support not only the standard approach but the exceptions as well. In other words: there are few situations in which a real-world model isn’t 3-tiered.
And there are good reasons to model the systems accordingly. If someone is a customer of a bank and an employee, there are very obvious SoD rules which apply: he shouldn’t grant loans to himself. If someone is a freelance insurance broker and an employee of the insurance company, the same is true: he shouldn’t manage the insurance contracts he is selling. If someone is a customer and an employee, it’s the same again: he shouldn’t give himself discounts, grant himself consumer loans, or just change the delivery queue.
However, several tools follow a 2-tiered approach. They know, for example, an “identity” and the “accounts” or “users” associated with that identity. If someone has more than one such identity, the problems begin. In some cases, it is very easy to adapt the object model. In others, you have to find workarounds, like mapping the unique IDs of these identities into the other identities, which might require a lot of additional code and is error-prone.
From my perspective, supporting a 3-tiered model out-of-the-box, with
- Context, Identities,… (whatever term you prefer)
- Users (in specific systems), accounts,… (again – choose your term)
is mandatory to reflect the reality in organizations and to support the business requirements – especially when it comes to SoD policies. If you don’t need three tiers, it is easy to just use two of them. But if your tool supports only two tiers out-of-the-box, it might become a tough task to implement your real-world model. Looking at that point is, from my perspective, one of the most critical aspects when it comes to technology decisions.
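The 3-tiered model described above can be sketched as a simple object model. All class names, contexts, and account IDs are illustrative, not any particular product’s schema; the point is only the shape: person at the top, identities (contexts) in the middle, system accounts at the bottom.

```python
from dataclasses import dataclass, field

@dataclass
class Account:           # tier 3: an account in a specific system
    system: str
    account_id: str

@dataclass
class Identity:          # tier 2: a context in which the person acts
    context: str         # e.g. "employee", "customer", "managing director"
    accounts: list = field(default_factory=list)

@dataclass
class Person:            # tier 1: the physical person
    name: str
    identities: list = field(default_factory=list)

mr_x = Person("Mr. X", identities=[
    Identity("employee", [Account("AD", "x-emp"), Account("SAP", "X28001")]),
    Identity("customer", [Account("CRM", "cust-4711")]),
])

# Because the person sits at the top, a policy engine can see that "x-emp"
# and "cust-4711" belong to the same human being - which is what makes
# person-level SoD checks possible at all.
all_accounts = [a.account_id for i in mr_x.identities for a in i.accounts]
print(all_accounts)
```

If you only need two tiers, you simply give each person a single identity; the reverse, squeezing three tiers into a 2-tiered product, is the painful direction.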