14.10.2013 by Martin Kuppinger
One of the challenges many organizations are facing in their IAM infrastructure is “Identity Information Quality”. That quality, especially in larger organizations, varies depending on the source it comes from. This challenge is not limited to the enrollment process but extends to all subsequent processes. While the creation of new digital identities in IAM systems (at least for employees) is frequently driven primarily through imports from HR systems, changes of attribute values might be triggered from many different sources.
Many organizations spend a lot of time and money to improve HR processes to achieve a higher level of Identity Information Quality. That clearly makes sense, especially in the context of HR standardization initiatives. However, even the best processes will not deliver perfect Identity Information Quality.
So the question is: Why not use the recertification capabilities of Access Governance tools to improve Identity Information Quality? Why not let the departmental manager or the user themselves recertify certain attributes? This would be just another type of recertification campaign. Recertification exists in Access Governance because Access Management processes are error-prone. If these processes worked perfectly well, no one would need recertification. The same is true for digital identities and their attributes, i.e. for Identity Information Quality.
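Such an attribute recertification campaign could be modeled very much like an access recertification campaign. Below is a minimal sketch of the idea; all class and field names (`Identity`, `AttributeRecertTask`, etc.) are invented for illustration, not taken from any particular product.

```python
from dataclasses import dataclass

@dataclass
class Identity:
    user_id: str
    manager: str          # departmental manager acting as reviewer
    attributes: dict      # attribute name -> current value

@dataclass
class AttributeRecertTask:
    user_id: str
    reviewer: str
    attribute: str
    current_value: str

def build_attribute_campaign(identities, attributes_in_scope):
    """Create one review task per identity and in-scope attribute,
    routed to the departmental manager - analogous to an access
    recertification campaign, but over identity attributes."""
    tasks = []
    for ident in identities:
        for attr in attributes_in_scope:
            if attr in ident.attributes:
                tasks.append(AttributeRecertTask(
                    user_id=ident.user_id,
                    reviewer=ident.manager,
                    attribute=attr,
                    current_value=ident.attributes[attr],
                ))
    return tasks
```

The point of the sketch is that no new machinery is needed: the same task/reviewer/decision workflow that handles entitlements can carry attribute reviews.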
When looking at other types of digital identities, such as those of partners and customers, organizations might need other approaches to improve Identity Information Quality. For partners, self-certification and recertification by the contact persons of the business partners might work. However, there is no need for that where Identity Federation is used – in that case, it is the responsibility of the business partner’s organization to enforce Identity Information Quality.
In the case of consumers, the option of self-certification – the option to review “account information” – might be one approach. Depending on the sales model, key account managers also might recertify their accounts. Furthermore, there is an increasing number of openly available information sources such as Facebook that under specific circumstances allow access via Graph APIs. These can be used to verify identity information.
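Verification against an external source boils down to comparing locally stored attributes with what the external API returns. The sketch below is deliberately generic: `fetch_external` stands in for whatever API client is actually available (a real Graph API integration would need its own authentication and field mapping, which are out of scope here).

```python
def verify_against_external_source(local_record, fetch_external):
    """Compare locally stored identity attributes with those returned
    by an external source. `fetch_external` is a placeholder callback
    that takes a user id and returns a dict of attributes."""
    external = fetch_external(local_record["user_id"])
    mismatches = {
        attr: (local_record.get(attr), external[attr])
        for attr in ("name", "email")
        if attr in external and external[attr] != local_record.get(attr)
    }
    return mismatches  # empty dict means the checked attributes agree
```

A mismatch does not automatically mean the local record is wrong, of course; it is a trigger for a review, not an authoritative correction.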
But back to the employees: to me, it appears just logical to recertify the identity and not only the access information.
23.07.2013 by Martin Kuppinger
Access Intelligence, sometimes also called Identity and Access Intelligence (IAI), is one of the hype topics in the Identity and Access Management (IAM) market. Some vendors try to position this as an entirely new market segment, while others understand this as part of Access Governance (or Identity and Access Governance, IAG).
The first question is what defines IAI. From my perspective there are two major capabilities required to call a feature IAI:
- It must use advanced analytical techniques that allow for a flexible combination and analysis of complex, large sets of data.
- It must support the analysis not only of historical and current access entitlements, but also of access information in context and based on actual use, ideally at runtime.
The first requirement is tightly related to the second one. IAI clearly cannot just rely on traditional reporting mechanisms. Analyzing more data and working with more complex data models will require other technologies, specifically Business Intelligence/Analytics and Big Data technologies.
The second requirement extends the current reach of Identity and Access Governance. IAG traditionally focuses on the comparison of as-is and to-be information about access entitlements in various systems. It also provides reporting capabilities on the current state of these entitlements, including information, for example, about high risk accounts etc.
IAI goes far beyond that, though. It should also enable analysis of the actual use of data, not only of the entitlements. Which documents have been used based on which entitlements? Is there high-risk information people try to access without sufficient entitlements? This analysis is based on information from various systems such as User Activity Monitoring (UAM), server log files, DLP (Data Leakage Prevention) systems, etc. It also can provide information back to other solutions. Access Intelligence thus becomes an important element in Information Stewardship.
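At its core, this kind of analysis is a join between the entitlement model and activity data. A minimal sketch of the “access without sufficient entitlements” check might look as follows; the data shapes (dicts and tuples) are simplified assumptions, since real UAM/DLP feeds are far richer.

```python
def flag_unentitled_access(entitlements, access_log):
    """entitlements: {user: set of entitled resources};
    access_log: iterable of (user, resource, outcome) events collected
    from UAM, server logs, DLP systems, etc.
    Returns events where a user touched (or tried to touch) a resource
    without a recorded entitlement - candidates for investigation."""
    return [
        (user, resource, outcome)
        for user, resource, outcome in access_log
        if resource not in entitlements.get(user, set())
    ]
```

Run over streaming events instead of a batch, the same join is what moves this from reporting toward the real-time analytics described above.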
IAI helps in moving from a static view to a dynamic view, especially once it supports real-time analytics. One could argue that this leads to an IAM version of SIEM tools (Security Information and Event Management). I’d rather say that it goes beyond that, because it combines IAG with IAI.
Identity and Access Analytics is just a logical extension and part of IAG tools. It allows for better governance. Thus, this should not be a separate set of products but become a part of every IAG solution. It is, by the way, only one of the areas where IAG has changed and will continue to change. In my presentation about “Redefining Access Governance: Going well beyond Recertification” at EIC 2013, I talked about eight areas of advancement for IAG – and I admittedly missed one in that list that I covered in other presentations, which is IAG for Cloud Services. The video recording of the session is available online.
More information about the current state of the IAG market is available in the KuppingerCole Leadership Compass on Access Governance.
08.07.2013 by Martin Kuppinger
Today RSA Security, a part of EMC [officially it’s “RSA, The Security Division of EMC”], has officially announced the acquisition of Aveksa, a company based in Waltham, MA. The deal closed on July 1st, 2013. Aveksa is a leading provider in the area of Identity and Access Governance (IAG), as depicted in our KuppingerCole Leadership Compass on Access Governance. Aveksa will continue to operate under the current leadership of its CEO Vick Viren Vaishnavi and will be part of the RSA Identity Trust Management business. Aveksa currently has approximately 175 employees.
One might ask why RSA did not enter the “core IAM” business earlier. That core was mainly Identity Provisioning, but for some years now it has been complemented by and has shifted towards IAG. Many people had expected such a move from RSA, given that the company delivers in several other areas of the IAM market, including Strong Authentication, Versatile Authentication, Access Management, and Federation. With the Aveksa acquisition, RSA has definitely made a move in that direction.
Instead of focusing on the traditional Identity Provisioning market, they focused on the emerging IAG market segment. Aveksa delivers some built-in provisioning capabilities but clearly does not have the breadth of connectors that the key players in that market segment provide. However, with IAG increasingly becoming an integration layer for existing “legacy” provisioning tools, Aveksa has emerged as a major player. By adding some provisioning capabilities, customer requirements can typically be covered. Aveksa builds here on an enterprise-grade approach with an Enterprise Service Bus (ESB) as the transport layer. Support for manual fulfillment is another important capability. Simply put: the number of connectors is not the key decision gauge. The main measure is the support for a structured and user-friendly approach to Access Request Management, Recertification, and Access Analytics, including the underlying Enterprise Role Management.
However, the real potential of that acquisition is not that RSA as of now can provide a solution for IAG. The potential is in combining the capabilities of both companies to open new grounds for Access Governance, beyond that which is common today. In my presentation about “Redefining Access Governance: Going well beyond Recertification” at EIC 2013, I talked about eight areas of advancement for IAG – and I admittedly missed one in that list that I covered in other presentations, which is IAG for Cloud Services. The video recording of the session is available online.
There is much room for improvement. Aveksa is a strong player in IAG. RSA adds not only Access Management and Federation, but strong and versatile authentication. And there is RSA Archer, an Enterprise GRC solution. The combination of RSA and Aveksa is, by the way, the only one in the market where strong authentication and IAG come together in one vendor. That will allow creating Access Governance for risk- and context-based authentication and authorization, the next big trend in IT. My colleague Dave Kearns and I both talked about that topic at EIC 2013, and Dave will do a webinar on this topic later this month. Governing the rules for such environments and adding analytics for that is a field of high interest. And this is clearly not the only area where both companies can leverage synergies, given the tight relationship between cyber-attacks and Access Management and Analytics.
RSA and Aveksa have started talking about some promising ideas, even in the context of EMC. EMC can add Big Data capabilities that allow moving IAG to the next level when it comes to analytics. And not only that: Combining authentication information, external threat intelligence, risk analytics etc. – all in the combined portfolio – might lead to game-changing offerings.
So there is strong potential. Let’s see whether, how, and when RSA delivers on this potential. Still, when looking at acquisitions the other important question is: What does it mean to existing customers? The good thing is that there is virtually no overlap between the current product portfolios of these two companies. Thus, there are no products that are likely to be discontinued. In fact, for RSA customers there is really the chance for new and advanced offerings. For existing Aveksa customers, the acquisition means that their supplier right now is not a niche player anymore but part of a far larger vendor, with substantial financial backing and a far broader portfolio. Thus, there is a strong potential that this turns out to be positive for existing Aveksa customers.
But as always: Only time will tell.
10.12.2012 by Martin Kuppinger
Recently, there was news here and here that a disgruntled technician of the Swiss spy agency NDB (Nachrichtendienst des Bundes) had stolen terabytes of counter-terrorism information shared between the NDB, the CIA, and MI6 (the UK spy agency). The person has been temporarily arrested. It is still unclear whether he has already sold some of that information or not.
This case, together with many others like the theft of data from Swiss banks, which is then sold to German tax offices, again highlights that the biggest security risk for most organizations comes from insiders. There is no doubt that the number of external attacks is increasing. There is no doubt about a massive risk for critical infrastructures. There is no doubt that manufacturing and, more generally, SCADA devices are at far higher risk than before.
However, there are two important aspects to consider:
- Many insiders have privileged access, frequently with a lack of control. They can potentially steal large amounts of data and cause massive harm.
- Many of the external attacks are in fact hybrid attacks involving insiders.
For organizations, this means that they should not focus only on external attacks. The concept of perimeter security is an illusion anyway. There is no such thing as “the perimeter around the organization” anymore. What organizations have to do is to move forward to protect information, regardless of where it is accessed from, where it resides, which device is used, and whether it is accessed by insiders or outsiders. Point solutions that claim to solve this issue won’t help without the bigger picture in mind. They just increase the risk of bad investments.
However, there are some things you have to do: Access Governance and Intelligence is one of them. Privilege Management is another. However, Privilege Management should be well integrated with Identity Provisioning and Access Governance/Intelligence instead of being a point solution. The most important thing to do now is to understand the big picture of information security. That’s what you should put on top of your agenda for 2013.
To learn how to best establish Information Stewardship as a principle, you should have a look at our new report “From Data Leakage Prevention (DLP) to Information Stewardship”, #70587, which has been written by my colleagues Mike Small and Dave Kearns.
27.10.2011 by Martin Kuppinger
In a recent briefing with CrossIdeas, the management buy-out (MBO) of the former Engiweb, an Italian software manufacturer in the area of Access Governance and Dynamic Authorization Management, the company demonstrated an interesting feature: doing recertifications based on relevance. Recertification of access rights is a key element of regulatory compliance. It is frequently done on a fairly standardized schedule, typically once or twice a year. For some specific systems or groups of users, we frequently see shorter intervals, i.e. a risk-oriented approach is not uncommon. However, cynics might say that the main purpose is still to make the auditors happy.
CrossIdeas now has implemented an approach they name “relevance”. Based on several criteria like the number of SoD violations, the system identifies the most relevant users for recertification. Currently it supports six different parameters. The weight of these parameters can be easily changed using sliders. The least relevant users then can be removed – again using a slider – from the result set (a relevance map), leaving only the relevant ones in there. Then recertification can focus specifically on them.
This feature isn’t a full replacement for standard, regular recertification campaigns, which are also supported by IDEAS, the CrossIdeas product. Relevance is, from my perspective, a nice concept which brings value to customers because they can easily implement focused recertification campaigns for the most relevant users in addition to standard recertification. That not only makes the auditor happy, but helps in better mitigating access risks. Not that standard recertification doesn’t help – but there is room for improvement, and CrossIdeas has demonstrated an approach to achieve that, which will be available in the new release due later this year.
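The mechanics behind such a relevance map can be sketched in a few lines: a weighted sum over risk parameters per user, plus a threshold that plays the role of the slider. The parameter names and weights below are invented for illustration; they are not CrossIdeas’ actual six parameters.

```python
def relevance_scores(users, weights):
    """users: {name: {parameter: raw value}}, e.g. number of SoD
    violations per user; weights: {parameter: weight}, the values
    adjusted via sliders. Higher score = more relevant for a focused
    recertification campaign."""
    return {
        name: sum(weights.get(p, 0) * v for p, v in params.items())
        for name, params in users.items()
    }

def focus_set(scores, threshold):
    """Keep only users at or above the relevance threshold - the
    'remove least relevant users' slider from the description above."""
    return {name for name, score in scores.items() if score >= threshold}
```

Adjusting a weight or the threshold immediately reshapes the focus set, which is what makes the slider-driven interaction described above practical.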
23.09.2011 by Martin Kuppinger
Today Microsoft announced that they have acquired technology assets from BHOLD, a Dutch vendor of Access Governance technology. Microsoft now owns technology which has been missing from its IAM portfolio and thus enters the Access Governance market. Whether that will happen through enhancements of their existing FIM 2010 product or by adding another product based on the BHOLD technology hasn’t been announced yet. Anyhow, the deal will change the Access Governance market, particularly regarding the offerings which are targeted to complement Microsoft FIM.
KuppingerCole will follow up on this news and provide further information as soon as it is available. Overall, this acquisition proves that Microsoft continues investing in the broader IAM space and thus rates this market segment as important to their customers. For existing BHOLD customers, the acquisition provides new opportunities, given that they are working with a much bigger vendor now. However, the impact on existing customers can only be assessed once the Microsoft roadmap is unveiled. In general, we recommend that existing BHOLD customers stay calm until more information is available. For customers investing or planning to invest in FIM 2010, the acquisition is definitely good news, because it means that FIM will grow beyond the somewhat technical approach into a more business-oriented solution over time. However, without the roadmap being unveiled, it is hard to predict when Microsoft customers will really benefit.
15.09.2011 by Martin Kuppinger
Today, the next story about banks failing to manage trading risks hit the news. It remains unclear what allowed the trader to execute unauthorized (and thus most likely illegal) transactions which led to the loss. However, the Risk Management of UBS obviously failed. By the way: UBS had to announce that just the day the Swiss parliament started a debate about new regulations for the finance industry.
It will be interesting to hear why that could happen. Did some people co-operate? Did the risk management system fail specifically for those types of transactions? Or was it an Access Management problem like at SocGen some time ago, where the trader was able to control himself? Whatever the reason, the incident proves that there is still a long way to go in Risk Management and overall GRC – not only in the finance industry.
GRC is a key task for the C-level management. It needs sufficient funding. It needs support for the organizational changes, to build an organization with a high degree of process maturity and the understanding of the GRC requirements. It needs a strategic approach to integrate Business and IT to optimally support GRC, given that most business relies on IT systems and fraud in these systems causes the most severe harm. It needs an organizational and an IT-architectural approach to be able to manage different regulations and all types of risks in a structured and efficient way.
For those thinking about how to move forward in GRC, today’s KuppingerCole webinar might be worth attending. It won’t answer all questions, but it will provide some valuable hints for moving forward in GRC. For sure, this is a long journey. But I strongly believe that it is feasible to avoid incidents like the one which happened now at UBS – and to mitigate the overall risks for organizations through a strategic GRC initiative (instead of point solutions).
17.08.2011 by Martin Kuppinger
During the last years, there has been a lot of change in the Identity Provisioning market. Sun became part of Oracle, Novell is now NetIQ, BMC Control-SA is now at SailPoint, Völcker has been acquired by Quest, Siemens DirX ended up at Atos. These changes, as well as other influencing factors like mergers & acquisitions, failed projects, and so on, lead to situations where customers start thinking about what to do next in IAM and around provisioning. Another factor is that provisioning solutions are sometimes implemented with a focus on specific environments – SAP NetWeaver Identity Management for the SAP ecosystem, Microsoft FIM for the Active Directory world. Not that they support only these environments, but they might be just another provisioning system. In addition, especially in large organizations it is not uncommon that regional organizations start their own IAM projects. The result: there are many situations in which organizations think about what to do next in provisioning.
However, just moving from product A to product B is not the best approach. In most cases, the deployment of provisioning tools took quite a while. In many cases, a lot of customizations have been made. And even while there might be some uncertainty about the future of one or another product (or, in some cases, certainty that the product will be discontinued sometime in the future), just migrating from one provisioning tool to another seems quite expensive for little added value.
From my perspective, it is important for organizations to move at their own pace. The approach to do that is to put a layer on top of provisioning systems. I’ve described several options in a research note (and some webinars) quite a while ago. The research note called “Access Governance Architectures” describes different approaches for layered architectures on top of provisioning products. I’ll write an update later this year, but the current version illustrates the basic principle well. By adding a layer on top of provisioning, which might be Access Governance, a Portal/BPM layer, or IT Service Management (or a mix), organizations can deal with more than one provisioning tool. The architecture is more complex than just using one provisioning tool. But if you are not able to rely on one provisioning tool only, it’s at least an approach that works.
Organizations can then, for example, replace provisioning tools fully or partially. The latter is quite common if complex customizations have been made for selected target systems. Organizations can deal with multiple provisioning systems that “just appeared” for some reason – M&A, specific solutions for a specific part of the IT ecosystem, or whatever. And they can move forward more flexibly than in a monolithic architecture. Yes, these approaches require some more architectural work at the beginning, but that pays off: in more flexible migrations, in avoiding some migrations altogether, and in fewer “political” conflicts with some of the lobbies within IT. It even enables changing the integration layer without affecting the underlying provisioning systems. And it certainly allows interfacing with target systems in a flexible way – not only through provisioning tools but also through service desks or other types of connectivity if required.
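The layering idea can be made concrete with a small routing abstraction: the governance layer talks to an interface, and each target system is mapped to whichever fulfillment channel handles it. All class names below are illustrative, not from any product.

```python
from abc import ABC, abstractmethod

class FulfillmentChannel(ABC):
    """One backend behind the governance layer: a provisioning tool,
    a service-desk ticket, or a direct connector."""
    @abstractmethod
    def grant(self, user_id: str, entitlement: str) -> None: ...

class ProvisioningToolChannel(FulfillmentChannel):
    def __init__(self, client):
        self.client = client  # e.g. an API client of a legacy tool
    def grant(self, user_id, entitlement):
        self.client.provision(user_id, entitlement)

class ServiceDeskChannel(FulfillmentChannel):
    def __init__(self, ticket_queue):
        self.queue = ticket_queue
    def grant(self, user_id, entitlement):
        # manual fulfillment: open a ticket instead of calling a connector
        self.queue.append(f"Grant {entitlement} to {user_id}")

class GovernanceLayer:
    """Routes each target system to its channel, so an individual
    provisioning tool can be swapped out without touching the
    request/approval layer above."""
    def __init__(self, routing: dict):
        self.routing = routing  # target system -> FulfillmentChannel
    def grant(self, system, user_id, entitlement):
        self.routing[system].grant(user_id, entitlement)
```

Replacing a provisioning tool then means changing one entry in the routing table, which is exactly the “move at your own pace” property argued for above.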
But, at the end, the most important thing is that it allows customers to move forward at their own pace. Thus, before you think about migrating away from your current provisioning tool, think about how you can save your investments and add value – by new functionality and by business-centric interfaces of Access Governance and the increased flexibility of your IAM environment.
10.08.2011 by Martin Kuppinger
Is there a mismatch between the reality in organizations and the implementations of at least several of the Identity Provisioning and Access Governance solutions when it comes to the representation of physical persons in IT? To me it appears that there is a mismatch.
The reality in all large organizations I know is that the real world is sort of 3-tiered:
- There is a physical person – let’s call him Mr. X
- Mr. X can act in very different contexts. You might call them roles or digital identities; however, all of these terms are overloaded with meaning. Three examples: 1. Mr. X might be an employee of an insurance company and, at the same time, a freelance insurance broker for that company. 2. Mr. X might be an employee of a bank and a customer of that bank. 3. Mr. X might be the managing director of both company ABC, Inc. and DEF, Inc., which are both owned by XYZ, Ltd., where he is employed as well.
- In each of these contexts, Mr. X might have more than one account. If he acts as an external freelance insurance broker or a customer, that might be only one account. If he is the managing director of several corporations within a group, he might have different Active Directory accounts, different RACF accounts, different SAP accounts, and so on.
You might argue that these are exceptions. However, being a customer of the employing company isn’t an exception in many organizations. And, by the way: a good and valid model has to support not only a standard approach but the exceptions as well. In other words: there are few situations in which a real-world model isn’t 3-tiered.
And there are good reasons to model the systems accordingly. If someone is a customer of a bank and an employee, very obvious SoD rules apply: he shouldn’t give loans to himself. If someone is a freelance insurance broker and an employee of the insurance company, the same is true: he shouldn’t manage the insurance contracts he is selling. If someone is a customer and an employee, it’s the same again: he shouldn’t give discounts, grant consumer loans, or just change the delivery queue.
However, several tools follow a 2-tiered approach. They know, for example, an “identity” and “accounts” or “users” which are associated with the identity. If someone has more than one such identity, the problems begin. In some cases, it is very easy to adapt the object model. In others, you have to find workarounds like mapping the unique IDs of these identities into the other identities, which then might require a lot of additional code and is error-prone.
From my perspective, supporting a 3-tiered model out-of-the-box, with
- Context, Identities,… (whatever term you prefer)
- Users (in specific systems), accounts,… (again – choose your term)
is mandatory to reflect the reality in organizations and to support the business requirements – especially when it comes to SoD policies. If you don’t need three tiers, it is easy to just use two of them. But if your tool supports only two tiers out-of-the-box, it might become a tough task to implement your real-world model. Looking at that point is, from my perspective, one of the most critical aspects when it comes to technology decisions.
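The 3-tiered model and the SoD check it enables can be sketched as follows; the type names (`Person`, `Context`, `Account`) are placeholders for whatever terms a given tool prefers.

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    system: str           # e.g. "Active Directory", "RACF", "SAP"
    account_id: str

@dataclass
class Context:            # "identity", "persona", ... - pick your term
    name: str             # e.g. "employee", "customer", "broker"
    accounts: list = field(default_factory=list)

@dataclass
class Person:             # the physical person - the top tier
    name: str
    contexts: list = field(default_factory=list)

def sod_conflicts(person, conflicting_pairs):
    """Flag persons whose contexts form a forbidden combination,
    e.g. ('employee', 'customer') at the same bank. This check only
    works if all contexts roll up to one physical person - which is
    exactly what a 2-tiered model cannot express directly."""
    names = {c.name for c in person.contexts}
    return [pair for pair in conflicting_pairs if set(pair) <= names]
```

Collapsing the top two tiers into one, as 2-tiered tools do, loses the link between Mr. X the employee and Mr. X the customer, and with it the ability to evaluate this kind of policy.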
28.07.2011 by Martin Kuppinger
Access Governance tools are becoming standard in IAM infrastructures. However, they mainly focus on “static” access controls, i.e. the entitlements granted to a user based on roles and other paradigms. Recertification is supported by these tools, and the solutions are maturing quickly. Thus, that part of Access Governance is easy to solve.
However, the next wave is coming with the increasing success of tools which are commonly called Entitlement Servers or Policy Servers. I tend to call them Dynamic Authorization Systems because they authorize based on rule sets and attributes at runtime. While the rules are set, the attributes are changing. I’m a strong believer in these tools and in XACML as the underlying standard for communication between the different modules and in heterogeneous environments.
But: What about Access Governance for these environments? Some of the Access Governance tools support that to some degree, allowing pre-evaluation of some business rules that use defined roles or attributes. However, many rules – especially business rules like “users of the life insurance back office with role xxx and a defined constraint for signing payments up to 50,000 € are allowed to sign that type of claim” – are out of scope. There is some support for testing such rules, for example from Axiomatics.
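To make the governance problem concrete: a runtime rule like the one quoted above can be pre-evaluated against a set of expected-outcome test cases before it is approved or recertified. The sketch below is plain Python rather than XACML, and all attribute names are invented for illustration.

```python
def payment_signing_rule(subject, action, resource):
    """Hypothetical business rule, roughly the example from the text:
    back-office users with a given role may sign claims up to their
    signing limit. Evaluated at runtime from attributes."""
    return (subject.get("department") == "life-insurance-backoffice"
            and "claims-officer" in subject.get("roles", ())
            and action == "sign"
            and resource.get("amount", 0) <= subject.get("signing_limit", 0))

def pre_evaluate(rule, test_cases):
    """Governance-style pre-evaluation: run the runtime rule against
    test cases of (subject, action, resource, expected_decision) and
    report which ones the rule gets right."""
    return [(case, rule(*case[:3]) == case[3]) for case in test_cases]
```

A lifecycle for such policies would wrap this kind of check into approval and recertification workflows, which is exactly the gap described in the next paragraph.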
However, I don’t see a solution which provides integrated Access Governance for all types of entitlements. Given that Dynamic Authorization Systems gain momentum, it’s just a matter of time until auditors ask for such solutions. These solutions should, like modern Access Governance tools, support lifecycle management for the policies, including approvals, auditing and analysis, and the recertification of such rules. That is more complex than what is done today. But, without any doubt, we will need this soon.
It will be interesting to observe who becomes the leader in that market. The vendors in the market of Dynamic Authorization Systems themselves? The Access Governance vendors? New startups?
By the way: The topic isn’t that new – look here.