From preventive to detective and corrective IAM

08.08.2014 by Martin Kuppinger

Controls in security and GRC (Governance, Risk Management, and Compliance) are commonly categorized as preventive, detective, and corrective controls. Looking at IAM/IAG (Identity and Access Management/Governance), we can observe a journey from an initial focus on preventive controls towards increasingly advanced detective and corrective controls.

IAM started with a preventive focus: managing users and their access entitlements in target systems. Setting these entitlements correctly prevents users from performing activities they should not perform. Unfortunately, this rarely works perfectly. A common example is access entitlements that are granted but never revoked.

With the introduction of Access Governance capabilities, some forms of detective controls were added. Access recertification focuses on detecting incorrect entitlements. The initial “access warehouse” concept, as well as various reports, also provided insight into the actual state of entitlements. Today’s more advanced Access Intelligence and Access Risk Management solutions likewise focus on detecting issues.

Some vendors have already added integration with User Activity Monitoring (e.g. CA Technologies), SIEM (e.g. NetIQ), or Threat Detection Systems (e.g. IBM, CyberArk). These integrations move detection from a deferred exercise towards near-time or real-time detection: if unusual activity is detected, alerts can be raised.

The next logical step will be corrective IAM – an IAM that automatically reacts by changing the settings of preventive controls. Once unusual activity is detected, actions are triggered automatically. The challenge is obvious: how do you avoid interrupting the business in the case of “false positives”? And how do you react adequately, without over-reacting?

In fact, corrective IAM will require moving action plans that today sit in drawers (best case) or exist only in the minds of some experts (worst case) into defined actions, configured in IAM systems.
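To make “defined actions, configured in IAM systems” a bit more concrete, here is a minimal sketch in Python. All names (the alert object, suspend_account, require_step_up_authentication, and so on) are hypothetical – real IAM, SIEM, or UBA products expose their own, proprietary APIs – but the pattern of graduated, pre-defined reactions is the point:

    ACT_THRESHOLD = 80     # act automatically only on high-confidence alerts
    REVIEW_THRESHOLD = 50  # medium scores get a softer reaction, limiting the
                           # business impact of false positives

    def handle_alert(alert, iam):
        """Map a detected anomaly to a defined, configured corrective action."""
        if alert.risk_score >= ACT_THRESHOLD:
            iam.suspend_account(alert.user_id)
            iam.revoke_entitlement(alert.user_id, alert.entitlement)
            iam.notify_security_team(alert, action="account suspended")
        elif alert.risk_score >= REVIEW_THRESHOLD:
            # react adequately without over-reacting: step up, don't block
            iam.require_step_up_authentication(alert.user_id)
            iam.open_recertification_task(alert.user_id, alert.entitlement)
        else:
            iam.log_for_later_review(alert)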

However, with the tightening threat landscape, with the knowledge that attackers might already be inside the system, and with IAM covering not only employees and internal systems but also business partners, customers, and the Cloud, IAM has to become far more responsive. IAM not only needs to become “real-time detective”, it also needs corrective controls. This will be the next logical step in the evolution of IAM, which started way back with preventive controls.



Do we need to kill IAM to save it?

28.02.2013 by Martin Kuppinger

Last week I received a newsletter from Radiant Logic, a vendor of Virtual Directory Services and some other IAM stuff like Federation Services. This newsletter pointed to a video of a presentation by Gartner analyst Ian Glazer titled “Killing Identity Management in Order to Save it,” which had been published on February 7th, 2013.

In this video he spends a lot of time talking about topics like:

  • IAM is too static and typically HR driven
  • IAM is not focused on providing services and integrating with business applications
  • IAM is based on LDAP (and CSV) and other hierarchical approaches
  • 2013 will be the year of Identity Standards, especially OAuth, OpenID Connect, and SCIM
  • Identity Services like those provided by Salesforce.com

When I read the Radiant Logic newsletter – which takes a fairly different view than Ian Glazer – and watched the presentation, I started looking for some of the things my colleagues and I have written about this.

There is, for example, an article on our website arguing that HR should not be the only leading system for IAM – it dates back to 2007 (and is available in German only). And there are more, covering topics like the Identity Explosion and the need to deal with far more users.

I found several articles, for example from back in 2008, looking at Identity Services, and there were webinars and reports on that topic years ago. Some vendors – Oracle, for example – have been integrating Identity Services into business applications for years now.

The end of LDAP in its current state was the topic of a blog post back in 2010 and I started discussing this with advisory customers at the same time.

Oh yes, clearly the standards mentioned will become more important this year. My colleague Craig Burton has described this on several occasions, including the KuppingerCole Scenario “The Future of Authentication”. And last year’s EIC hosted a workshop on the relevance of all these upcoming standards.

All the topics around identity services hosted by Salesforce.com or Microsoft’s upcoming Windows Azure Active Directory have also been a frequent topic in Craig’s blog posts and in some of our research notes.

There is nothing wrong with these theses. However, there is also not that much new in them.

Below the link to the video of the Ian Glazer presentation, there is the following claim:

The way the industry does identity management cannot incrementally improve to me [sic] future (and current) needs. I believe IAM must be killed off and reborn.

Given that I do a lot of advisory work besides research, like all KuppingerCole analysts, I really struggle with this claim. There is no doubt that we need to “extend and embrace” what we have traditionally done in IAM. It is about more than Identity Provisioning. Topics like versatile and context-/risk-based authentication and authorization, together with Identity Federation, are moving towards the center of attention – and not only for core IAM challenges. We need to understand that there are new challenges imposed by the Computing Troika and that traditional approaches will not solve them.

However, I do not believe in disruption. I believe in approaches that build on existing investments. IAM has to change, no doubt about that. But there will still be a lot of “old school” IAM alongside the “new school” parts. Time and again it has been proven that change without a migration path is an invitation to disaster. Embrace and extend is the classical methodology for technical transformation strategies.

I plan to do a session on this topic at EIC 2013 – don’t miss it if you want to save your investments and spend your budgets in a targeted way to meet today’s and tomorrow’s challenges in IAM.


SailPoint and BMC – how to move forward?

14.06.2011 by Martin Kuppinger

There has been a lot of FUD (Fear, Uncertainty, Doubt) around Control-SA. The product was moved from BMC to SailPoint in spring 2011, but communication about the impact on customers has been weak (to use a positive term…). After several talks with both SailPoint and BMC, I’d like to provide some information. First of all, SailPoint now owns Control-SA, including the support team and other related human resources. There even is a roadmap for Control-SA, and support for the newer releases (ESS 7.5.x) will be provided for several years to come.

On the other hand, SailPoint IdentityIQ is now on BMC’s price list. It can be bought with BMC contracts, BMC first-level support, and so on. It is the strategic solution for Access Governance and Identity/Access Management offered by BMC. BMC itself focuses only on BIRM (BMC Identity Request Management), not to be confused with BRIM (BMC Remedy Identity Management), which is no longer sold by BMC – its relevant parts are now either BIRM or SailPoint products (the former Control-SA).

SailPoint will soon provide its own provisioning engine – a lightweight implementation controlled by the Access Governance (and Lifecycle Management) components of IdentityIQ and using the existing Control-SA connectors. SailPoint additionally plans to release new connectors.

This gives customers a lot of choices for moving forward. They can use Control-SA for quite a while, at least if they use ESS 7.5.x or higher. They might move to the SailPoint provisioning engine, using IdentityIQ on top and the existing connectors. They might migrate to other provisioning tools, and so on. But the most important thing is: Control-SA isn’t dead, and customers can take their time to consider their options. My advice is: take your time and think about how your IAM, Access Governance, and Service Request Management should look in the future.

I wrote a research note on “Access Governance Architectures” some 15 months ago. In it, I discuss different architectural approaches for Access Governance – and many of them are relevant when rethinking your strategy and architecture around the three topics mentioned above. The most important point is: it is not about having exactly one central provisioning tool anymore. Provisioning tools are an important element, but a lot of companies struggle with standardizing on one tool. There might be tools that have been in use for quite a while for specific environments, sometimes heavily customized – think about mainframe connectors. There are mergers and acquisitions bringing in new tools. There are lobbies pushing specific solutions for the Microsoft Active Directory environment or the SAP infrastructure. And large organizations might simply have IT infrastructures that are too complex, divided across many organizational divisions.

That’s where integrating layers like Access Governance and/or Service Request Management come into play. They might become the glue between different provisioning systems. And they even make it easier to change things at the provisioning layer. Modular architectures are somewhat more complex architecturally and from the integration perspective, but they provide more flexibility for change.

Looking at Control-SA environments, putting such a layer on top (which might be SailPoint IdentityIQ but could be another Access Governance tool, SRM tool, or portal as well) allows you to migrate Control-SA at your own pace to whatever you want – or to add other provisioning tools if required. This provides flexibility. And in most cases it is the better choice than just replacing one monolith with another. By the way: that is true for all the other provisioning systems, which might have to be migrated at some point in time as well.

Thus: evaluate your options first. Build a future-proof architecture (as future-proof as it can be based on what is there today). Then decide what to do with Control-SA, and when. This will give you more time for your decisions, and you will most likely end up with a better solution. Whether you then end up with a pure SailPoint solution, a mixed SailPoint/BMC (BIRM) solution, a mixed-vendor solution, or a solution purely from another vendor depends on your requirements. But it should be a well-thought-out decision, not something done in a hurry.


SCIM – will SPML shortcomings be reinvented?

23.04.2011 by Martin Kuppinger

There is a new initiative driven by Google, salesforce.com, and Ping Identity called SCIM (Simple Cloud Identity Management). It claims to overcome the shortcomings of SPML (Simple Provisioning Markup Language), a standard that has been around for some 10 years. SPML aims to be a standard for provisioning information between systems. It is supported by most provisioning and access governance tools, but by only a few target systems. SAP is probably its most important supporter.

Google, salesforce.com, and others in the cloud don’t support SPML. Thus, provisioning to these systems requires using proprietary APIs, if available at all – Google and salesforce.com provide such APIs, but not every cloud provider does. To overcome this, work on SCIM has started.

The first question, however, is: why not use SPML? The main reason might be that SPML is XML-based and does not focus on REST, which appears to be the somewhat more efficient (and, especially, more accepted) way to implement standards for the cloud. Another might be that SPML is moving forward very slowly, if it is moving at all. There are many deficiencies in SPML, no doubt about that. They start with the limited support by non-IAM vendors. There are technical limitations as well, including performance issues in large-scale deployments and limits on what can be provisioned via SPML.
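To illustrate the REST point, here is a minimal sketch of what SCIM-style provisioning looks like from the consumer side: a single JSON document POSTed to a /Users endpoint. The service (scim.example.com), the credentials, and the exact attribute set are assumptions for illustration; the attribute names follow the early SCIM core schema drafts.

    import json
    import requests

    SCIM_ENDPOINT = "https://scim.example.com/v1/Users"  # hypothetical service

    new_user = {
        "schemas": ["urn:scim:schemas:core:1.0"],
        "userName": "jdoe@example.com",
        "name": {"givenName": "Jane", "familyName": "Doe"},
        "emails": [{"value": "jdoe@example.com", "primary": True}],
        "active": True,
    }

    response = requests.post(
        SCIM_ENDPOINT,
        data=json.dumps(new_user),
        headers={"Content-Type": "application/json"},
        auth=("provisioning-client", "secret"),  # placeholder credentials
    )
    print(response.status_code, response.json().get("id"))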

Nevertheless, I’d like to ask two questions:

  • Wouldn’t it be better to join the forces behind SPML and SCIM to build an SPML version 3.0 which supports REST as well?
  • If working on a new or improved standard, wouldn’t it make sense to address all relevant use cases? SPML doesn’t today, and SCIM is not likely to either, judging by the information provided so far.

The first aspect seems to be mostly a political issue between different vendors. However, having two standards doesn’t help anyone at the end of the day.

That’s even more true if both standards are too lightweight and don’t cover all that companies need today. Looking at the small piece of the SCIM specification published so far, it looks like SCIM will only scratch the surface of what is required. The use cases are focused on providing user information to cloud services. However, the topic isn’t identity management, it is identity and access management. The access or entitlement part is the big thing to solve. Dealing with the different identity APIs of different cloud providers is an issue, but it isn’t the biggest one – several vendors (federation, classical on-premise provisioning, cloud provisioning) have addressed this at least for the leading cloud providers.

But what about controlling who is allowed to do what in these services? How do you manage entitlements, e.g. group memberships, authorization rules, and other things? XACML is a standard which supports this, but again there is little to no support for XACML by cloud providers – just as with SPML. Thus, when starting to define a new standard, it shouldn’t be too simple a one, which SCIM appears to be at this point in time. It has to be one which covers all relevant use cases of identity and access management. There is only limited value in providing user information to a cloud service but still having to use the proprietary web administration interface (or some proprietary APIs) to control access for that user, to define groups, roles, policies, and so on.
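For comparison, this is roughly what an externalized, XACML-style authorization check could look like if cloud providers exposed a policy decision point. The endpoint is hypothetical; only the request format follows the XACML 2.0 context schema. Today, the same question – may this user read these records? – typically has to be answered in each provider’s proprietary administration interface.

    import requests

    PDP_URL = "https://cloud-provider.example.com/pdp/authorize"  # hypothetical

    xacml_request = """<?xml version="1.0" encoding="UTF-8"?>
    <Request xmlns="urn:oasis:names:tc:xacml:2.0:context:schema:os">
      <Subject>
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:subject:subject-id"
                   DataType="http://www.w3.org/2001/XMLSchema#string">
          <AttributeValue>jdoe@example.com</AttributeValue>
        </Attribute>
      </Subject>
      <Resource>
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id"
                   DataType="http://www.w3.org/2001/XMLSchema#string">
          <AttributeValue>crm:opportunities</AttributeValue>
        </Attribute>
      </Resource>
      <Action>
        <Attribute AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id"
                   DataType="http://www.w3.org/2001/XMLSchema#string">
          <AttributeValue>read</AttributeValue>
        </Attribute>
      </Action>
    </Request>"""

    response = requests.post(PDP_URL, data=xacml_request,
                             headers={"Content-Type": "application/xml"})
    print(response.text)  # expected: an XACML <Response> with Permit or Deny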

My conclusion: there should be open standards for identity and access management in the cloud. Building on proprietary services means repeating errors made before. But a new standard shouldn’t be too limited from the beginning. That, by the way, is one of the reasons I see behind the very limited success of SPML: it was too limited. I remember a conversation years ago with one of the leading people involved in SPML, in which I suggested looking at use cases like supporting client lifecycle management solutions, e.g. tools supporting (amongst other features) software deployment. There are vendors in the client lifecycle management market building custom integrations to HR or provisioning tools today, but not based on SPML – because they have never heard of SPML and because SPML never addressed this use case.

There might be a good reason for an effort like SCIM. But a standard that is merely REST-based, without really thinking beyond what SPML supported, won’t solve the real-world problems. Thus I strongly recommend rethinking SCIM and looking at significantly extended use cases.

If someone would like to discuss this with me in person, the best place is EIC in Munich, May 10th to 13th.


Why you should focus on the infrastructure layer

21.04.2011 by Martin Kuppinger

In these days of slowly increasing Cloud Computing maturity, it becomes more and more obvious that – and why – IT depends on a well-thought-out layer which I tend to simply call “infrastructure”. I have two simple pictures of IT in mind:

  • The somewhat classical model of platform, infrastructure, and software, as found in PaaS, IaaS, and SaaS in the common Cloud Computing meta-models. It’s about hardware and other foundational components like operating systems, about the layer in between to manage and orchestrate everything, and about the applications themselves.
  • Another view also consists of three layers: the services exposed to the users (i.e. in most cases the business) on top, the service production (in the public cloud, a private cloud, or non-cloudified IT environments) at the bottom – and a layer in between which, again, is used for managing and orchestrating everything. Again, this layer might best be called “infrastructure”.

This layer is what connects everything. Thus, the efficiency and effectiveness of this layer are the foundation of the efficiency and effectiveness of the entire IT. Optimizing this layer allows you to better connect the available services to business demands, and to manage the different layers in the cloud.

When looking at that layer, there are a few key elements:

  • Service Management, i.e. the entire area of procurement, service request management, accounting, availability, performance, and whatever it takes to ensure that services are delivered as expected
  • Information Security Management, including IAM (Identity and Access Management) and at least IT GRC (Governance, Risk Management, Compliance)
  • Application Infrastructures, i.e. middleware for connecting services, enhancing them if required, and doing the orchestration

Did I miss important elements? OK, there is classical IT security; however, that’s part of Information Security – the reason we do IT security at all is to protect information. You might add some other elements, but I tend to keep this model simple.

To me it appears more important to look at the dependencies between these three elements. Information Security and Service Management have to work hand in hand to ensure that access to services is restricted and controlled. Applications and Information Security are tightly related – think about how to build secure apps. And applications are, at the end of the day, nothing other than services which have to be managed.

I personally believe that starting with such a model and outlining the blueprint for your future IT definitely helps in separating the important from the less important things, and in focusing on building an IT ecosystem in your organization that is stable and works with whatever you plan to do in the Cloud.

See you at EIC 2011 in Munich, May 10th to 13th.


Lessons enterprises should learn from the recent wiki-leak

17.12.2010 by Martin Kuppinger

There has been a lot of discussion around Wikileaks publishing an incredible amount of data which had been classified as confidential by the US Government. I don’t want to discuss that specifically – many people have done so before, with fundamentally different conclusions. More interesting is what this means for private organizations, especially enterprises. Wikileaks has threatened some of them: the Russian oligarchs, the finance industry in general. That comes as no surprise: Wikileaks founder Assange rates them as “bad”, i.e. as his enemies. Given that Wikileaks isn’t alone out there, there is an obvious threat to any enterprise. Some might think that construction plans from the defense industry should be published. Others might think the same should be done with blueprints from the automotive industry after claimed incidents, or with the cost accounting of utilities if power or gas appears to be too expensive. I don’t want to judge the reasons – I have my personal opinion on this, but that’s out of scope for this post.

Looking at that situation from an enterprise perspective, it becomes obvious that information security has to move to the top of the CIO agenda (and the CEO agenda!) if it isn’t there yet – assuming the enterprise isn’t willing to share everything with the public: blueprints, calculations, whatever… That requires approaches which are somewhat more fine-grained than the ones obviously in place in the US government, which allowed a private (or something like that – I’m not that familiar with the ranks in the US military) to access masses of documents. It also requires efficiently protecting the information itself instead of only the information system. Information tends to flow, and once it is out of the system, system-level security no longer applies.

That leads inevitably to the topic of Information Rights Management (IRM), which is a frequent topic in Sachar Paulus’s blog and mine – just have a look. However, implementing IRM the typical way in organizations requires centralized policies, classifications, and so on. And classification obviously failed in the latest Wikileaks incident. Thus, I’d like to bring in an idea Baber Amin recently raised in a discussion during a KuppingerCole webinar. He talked about “identity-based encryption”, which in essence means encrypting information in a way that is controlled by the individual user. That leads to an IRM where the individual user – not (mainly) the organization – controls who is allowed to use the information he creates or owns.

But: will that work? Some arguments and counter-arguments:

  1. Information is not accessible once the user leaves the organization: Not correct; there might be an additional “master” key to allow recovery, and so on (see the sketch after this list). Many lessons could be learned from Lotus Notes in that area, to name one example.
  2. There are no corporate policies: Not correct; these could be understood as a second level of protection, adding to the first level managed by the user – classical IRM and personalized IRM could be combined.
  3. It won’t work because the user doesn’t understand what to do: Not correct. Just look at how users deal with information security in their daily lives. For sure some things go wrong and lessons have to be learned (not appearing drunk in a photo on Facebook, for example), but overall this works pretty well. Combined with corporate policies, it should turn out much better than corporate policies alone. Trust the employee and the wisdom of crowds.
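Here is a minimal sketch of how user-controlled protection plus a corporate recovery key (argument 1 above) could be combined, using simple envelope encryption with the Python cryptography library. This is illustrative only – it assumes out-of-band key distribution and leaves out policy enforcement and everything else a real IRM product would need:

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    def protect(document, recipient_public_keys, recovery_public_key):
        """Encrypt under a fresh data key; the *author* decides who gets a
        wrapped copy of that key. The corporate recovery key gets one too."""
        data_key = Fernet.generate_key()
        ciphertext = Fernet(data_key).encrypt(document)
        wrapped = {name: pub.encrypt(data_key, OAEP)
                   for name, pub in recipient_public_keys.items()}
        wrapped["_recovery"] = recovery_public_key.encrypt(data_key, OAEP)
        return ciphertext, wrapped

    def read(ciphertext, wrapped, name, private_key):
        """Only chosen recipients (or the recovery officer) can unwrap."""
        data_key = private_key.decrypt(wrapped[name], OAEP)
        return Fernet(data_key).decrypt(ciphertext)

    # Demo: the author grants access to Alice only; recovery stays possible.
    alice = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recovery = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    ct, keys = protect(b"confidential blueprint",
                       {"alice": alice.public_key()}, recovery.public_key())
    print(read(ct, keys, "alice", alice))          # Alice can read it
    print(read(ct, keys, "_recovery", recovery))   # so can the recovery officer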

Simply put: think about doing it differently than before. It is not about adding new tools at the (perforated) perimeter and yet more point solutions. It is about building a few consistent lines of defense, including and especially next-generation IRM. For sure there is some way to go, and the tools aren’t there yet. But when thinking about how to protect the intellectual property and the secrets your organization wants to keep (for whatever reason – I don’t judge here…), you should definitely think beyond the traditional approaches of IT security – look especially at Information Security instead of Technology Security, i.e. the I and not the T in IT.

If you think this topic is worth thinking about, you shouldn’t miss EIC 2011 – the conference on IAM, GRC, and Cloud Security, and thus also on the things discussed in this post. And don’t hesitate to ask for our advisory services ;-)


Beyond LDAP – have a look at system.identity

20.06.2010 by Martin Kuppinger

LDAP (Lightweight Directory Access Protocol) is well established. It is the foundation for today’s Directory Services, which support LDAP as a protocol and which usually build their data structure on the associated LDAP schema. There are many interfaces for developers to use LDAP, from the LDAP C API to high-level interfaces for many programming environments.

Even though LDAP is well established, it is somewhat limited. There are several restrictions; two important ones are:

  • The structure of LDAP is (more or less) hierarchical. There is one basic structure for containers – and linking leaf objects (think about the association of users and groups) is somewhat limited. That structure is a heritage of X.500, from which LDAP is derived – LDAP originally being the lightweight version of DAP (Directory Access Protocol). X.500 was built by telcos for telcos, i.e. with respect to their specific needs for structuring information. However, anyone who has ever thought about structuring Novell’s eDirectory or Microsoft’s Active Directory knows that there is frequently more than one hierarchy – for example, location and organizational structure. The strict hierarchy of LDAP is an inhibitor for several use cases (see the sketch after this list).
  • LDAP is still focused on the specific, single directory. It doesn’t address the need to store parts of the information in fundamentally different stores. But the same piece of information might be found locally on a notebook, in a network directory like Active Directory, in a corporate directory, and so on. How do you deal with that? How do you use the same information across multiple systems, exchange it, associate usage policies with it, and so on? That is out of scope for LDAP.
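A small sketch of the first limitation, using the Python ldap3 library against a hypothetical directory (example.com): an entry lives under exactly one DN, so only one hierarchy – here, location – can be browsed via the tree itself, while a second hierarchy such as the organizational structure has to be emulated with attribute filters.

    from ldap3 import Server, Connection, SUBTREE

    server = Server("ldap://directory.example.com")          # hypothetical
    conn = Connection(server, user="cn=admin,dc=example,dc=com",
                      password="secret", auto_bind=True)

    # cn=jdoe,ou=Sales,ou=Munich,dc=example,dc=com sits under exactly one DN,
    # i.e. one hierarchy. Browsing by location works via the tree:
    conn.search("ou=Munich,dc=example,dc=com", "(objectClass=inetOrgPerson)",
                search_scope=SUBTREE, attributes=["cn", "departmentNumber"])

    # A second hierarchy (organization, independent of location) is not in the
    # tree; it has to be emulated with attribute filters instead:
    conn.search("dc=example,dc=com",
                "(&(objectClass=inetOrgPerson)(departmentNumber=Sales))",
                search_scope=SUBTREE, attributes=["cn", "l"])
    for entry in conn.entries:
        print(entry.entry_dn)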

I could extend the list – but this post is not about the limitations of LDAP. LDAP has done a great job for years, but there is obviously a need to take the next big step. An interesting foundation for that next big step comes from Kim Cameron, Chief Identity Architect at Microsoft. He has developed a schema which he calls system.identity. There hasn’t been much noise around it so far: there is a stream from last year’s Microsoft PDC, there is a little information on MSDN plus a blog post, and there is the keynote from this year’s European Identity Conference. But it is worth a look. The approach of system.identity is to define a flexible schema for identity-related information which can cover everything – from local devices to enterprise- and internet-style directories, from internal users to customers and device identities, including all the policies. It is, from my perspective, a very good starting point for an evolution (compatibility with LDAP is covered) well beyond LDAP and today’s directories.

I recently put the concept to a stress test in a customer workshop. The customer is thinking about a corporate directory. Most people there are not directory guys but enterprise IT architects. And they definitely liked the path system.identity is showing; it covers their needs much better than the LDAP schema. That proved to me that system.identity is not only for geeks like me but obviously for the real world. Thus: have a look at it and start thinking beyond LDAP. The concept of system.identity, despite being at an early stage, is a very good place to start.


Is an insecure smart planet really smart?

25.03.2010 by Martin Kuppinger

There is a lot of talk about making our planet smarter. Despite being far too much fiction, the film “Die Hard 4.0” touched on some of the potential risks around this. I recently had a very interesting discussion with a forensics/incident expert from the US. We discussed several issues and ended up at the idea of this “smarter planet” and the “smart grid” as one of its most prominent elements. Per se, the idea of having a networked infrastructure in many areas, with a high degree of flexibility and increased service availability, is as appealing as it is inevitable – things will go down that path.

However, the security of that future seems to be somewhat ignored, at least in the public discussion. For sure, politicians aren’t interested in the dark side of things as long as the bright side is being discussed. They don’t want to be the party poopers. Only when there is an incident will they claim that they did everything to avoid it and that everyone else is guilty, but not them. Vendors, on the other hand, are mainly interested in driving things forward. Most of them for sure don’t ignore security – but it seems to be treated more as a pain than as an opportunity.

Thus, we currently observe in the large the same thing we see day by day in the small: security is ignored when driving things forward. That is true for a tremendous part of the software being developed, it is true for new standards in IT (think about web services – security was missing at the beginning), and it is true for so many other areas. Now the same thing seems to be happening with all these smart things. But, from my perspective, these things then aren’t really smart.

Just think about smart grids: they are sort of a massive data retention mechanism, collecting data from millions of households and networking them with the utilities. There are privacy threats – who used which electric device, and when? There are new attack surfaces. For sure there are some things going on around security. But from what I observe, security is developing more slowly than everything else in the smart planet initiatives. It’s sort of a ticking time bomb out there.

What will happen? Security is undervalued. For sure it isn’t ignored, but it won’t have the relevance it should have in these projects. People will cheer when projects deliver some results. Security will become a problem. There will be unpleasant discussions about who is guilty. Security issues will be patched – to some degree. Wouldn’t it be a better idea to build security into the concepts from scratch? To really have a smarter planet at some point in time?

Sorry for being the party pooper!


Back to the basics – you still need “core IAM”

03.03.2010 by Martin Kuppinger

These days the industry talks a lot about IT GRC, Risk Management, Access Governance, Identity for the Cloud, and so on. However, we should keep in mind that the vast majority of organizations still have a lot of homework to do around basic Identity and Access Management. Even more: that’s the foundation for many of the other things like Access Governance, because it’s not only about auditing but also about managing (and, honestly, it’s much more about managing and enforcing preventive controls than about auditing in a reactive way, isn’t it?).

Thus, you shouldn’t ignore Identity Provisioning, Virtual Directory Services (still one of the most valuable technologies in IAM and one of its best-kept secrets at the same time), or Enterprise SSO. You will find a lot of podcasts and webinar recordings on our website. Thus, I won’t analyze everything here but focus on a few points why we should still consider the core IAM market relevant:

  • Provisioning tools have matured over the past years – and they frequently support many of the “new” features like access certification. Thus you can do a lot relying only on these “basic” tools instead of adding too much on top of them. Not everything, but a lot. That has to be carefully analyzed, but in several cases one tool definitely is the better solution than multiple tools. That’s like in real life: there are advantages to the multi-tool, and there are advantages to specialized tools.
  • If you look at the market, there are relatively few really big organizations. Most of them have some IAM. But, to be precise, most of them have more than one IAM approach and implementation. Thus, they have integration issues – which is an important market, with many architectural options for solving it. Beyond that, in these large organizations you can frequently observe a tendency to implement point solutions in some areas – for example, an additional provisioning tool for specific systems. Given that, there is still a lot of work to do and a lot of potential, for example in providing a provisioning tool which integrates other provisioning tools.
  • Medium-sized businesses frequently don’t have much provisioning or other IAM solutions in place. Thus, there is a huge market opportunity, for on-premise as well as cloud-based solutions.
  • Some implementations might be worth a review with respect to today’s requirements and solutions. There is always room for updates and even replacements.

The reason why vendors’ marketing departments pay somewhat less attention to that segment (at least when looking at vendors which offer more than just provisioning) is simple: provisioning is hard to sell. E-SSO is easier to sell. Access Governance might be even easier than that. Thus, going for the low-hanging fruit instead of focusing on products with a long sales cycle and a lot of competition appears logical from a sales perspective. However, that leaves a large portion of the market untouched, and it doesn’t fill the pipeline sufficiently for the time when the low-hanging fruit has been picked.

It’s not up to me to judge vendor marketing and sales strategies. But it is interesting to observe what is happening in the market. And that might be one reason for the relative success of several of the smaller vendors in many markets (by the way: some large vendors are very active in the “classical” segments – innovative, focused, …).

From a customer perspective, the buzz around the new topics might divert focus from the things which have to be done as a foundation, on which everything else can be built. Thus customers should always keep in mind that they can’t be successful without doing their homework. And that includes providing a solid foundation for provisioning – with an architecture adequate for the customer’s requirements. I’ll blog about these architectures soon, but you might as well look here – I’ve touched on the topic in this webinar.

Don’t miss the European Identity Conference 2010 and its Best Practice presentations to learn more about this. See you in Munich, May 4th to 7th.


Identity Management and the Cloud

14.04.2009 by Martin Kuppinger

Cloud Computing will be the next big paradigm shift in IT. I have no doubt about that. But as in many other cases, there is first of all a vision, then a buzzword, then some basic technology – and only then do people start to think about things like reliability and security. The same is true of Cloud Computing. There are many services out there, but IAM and GRC for the cloud are heavily underestimated.

That is somewhat funny, given that some of these services appeared during the big New Economy bubble some ten years ago. Salesforce.com is just one example; some of the online conferencing providers have likewise been in the market for years now. But only a few of them support even basic standards like SAML (Security Assertion Markup Language) for Identity Federation. And many still lack support for such standards, not to mention more advanced approaches like Information Cards or XACML.

Beyond the missing support for existing standards, there is the issue of missing standards. There are virtually no standards for GRC, for example for auditing and alerting (and SNMP isn’t the solution for the cloud). Even XACML is more of a technical standard, which needs a lot of additional work to really support authorization management in the cloud.

There are some additional offerings, for example for Single Sign-On to the cloud; there are some identity providers for the very lightweight OpenID and even fewer for Information Cards; and there are a few offerings for Identity Provisioning from the cloud, i.e. managed services for Identity Management. Some of the more interesting vendors in the market are, amongst others, companies like Fischer (Provisioning), Ping Identity (Federation), TriCipher (Authentication), Arcot Systems (Authentication), Multifactor Authentication (again Authentication), and Fun Communications (Information Cards). But the number of offerings is still relatively small.

On the other hand, it is obvious that IAM and GRC for the cloud will become a very fast-growing segment of the IT market, for ISVs as well as for Identity Providers. And it will also be an interesting opportunity for consultants who support all the other providers in the cloud in enabling their applications for the IAM and GRC requirements of their customers.

For a provider in the cloud to be successful, the “externalization” of the management of authentication and authorization, as well as externalized auditing, will become mandatory. Customers can’t afford to manage authorizations per cloud service; they will have to apply pre-defined policies. Thus, we need new standards, and we need new semantics for existing standards like XACML at a much higher level than today.

The entire industry – cloud providers as well as customers and IAM/GRC vendors – has to work together on this. Feel free to send me your ideas and proposals – we’re currently preparing the launch of a standards initiative on some IAM/GRC issues, and this might be the next one.

More on IAM and GRC for the Cloud at the European Identity Conference 2009 (Munich, May 5th to 8th).


© 2014 Martin Kuppinger, KuppingerCole