Calendra is back – at least sort of

22.06.2011 by Martin Kuppinger

Do you remember Calendra? The vendor that was acquired by BMC many years ago? Many existing customers still rely on the Calendra technology. And some of them really miss the company.

What made Calendra popular was a tool that allowed customers to quickly build applications for dealing with information held in directories. That approach was different from provisioning, different from meta directories, and it did not require hard-coding everything. As a specialized IDE for directory environments, it enabled customers to quickly build directory-based applications, for example to manage employee data or to implement specific approaches for delegated administration.

Even though a few vendors have tried to fill that gap, it still largely remains. But there is now more than a silver lining on the horizon. The German company ITConcepts, a BMC partner (amongst others), has acquired the Calendra technology and is re-launching it. The product will be named Cognitum. The first version, available now, is more or less a rebranded version of the existing Calendra product. ITConcepts plans to quickly release an updated version with support for newer Java stacks and some other maintenance work, before moving forward with functional enhancements.

The good news for existing Calendra customers is that someone is actively working on the product again and providing support. And ITConcepts has a defined roadmap which looks realistic and doesn’t promise too much. However, given that ITConcepts has been a system integrator only until now, it will be interesting to observe how the company executes in the software business. In any case, it is worthwhile for Calendra customers to have a look at this offering. And chances are good that the gap left in the overall IAM landscape when Calendra was acquired by BMC will be filled again.


Data Protection Laws – Location or Information?

15.06.2011 by Martin Kuppinger

One of the intensively discussed issues in Cloud Computing is compliance with local data protection and privacy laws. The European laws, for instance, are sort of “location-dependent”. It is much easier to deal with PII (Personally Identifiable Information) within the borders of the EU than outside of that region. That is the reason why many large Cloud Providers build data centers within the EU to support their European customers.

The question which recently came to my mind is: Does it really make sense to focus on location? Shouldn’t we rather focus on the security of the information itself? The goal is to prevent abuse of PII and other sensitive information. The laws focus on processing, with a very broad definition of the term “processing”. Processing is allowed only if the providers follow specific rules. However: When we clearly define these rules, when we audit the providers, when we certify them – why should the location really matter?

You could argue that there are regions where you wouldn’t expect the government to respect these rules. You could argue that there are countries, like the US, where some laws contradict European laws. However, all of that could easily be covered by the rules defined in the law. There is no logical reason to regulate by location. And looking at the way governments in some European countries act, I wouldn’t say that location is the best criterion for enforcing data protection.

From my perspective it would be a good idea to fundamentally re-think data protection laws and to define protection requirement levels for different types of PII and different types of processing. Rules for the requirements external (cloud) providers have to fulfill could then be defined – independent of location. If one of these rules conflicts with the local laws of the country in which the provider operates its data center, the result would be the same as today. But overall, we would end up with far more flexibility for the cloud.
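To make the idea tangible, here is a minimal sketch of such a location-independent model in code. The PII categories, processing types, and requirement levels are purely hypothetical illustrations, not taken from any actual or proposed law:

```python
from enum import IntEnum

class ProtectionLevel(IntEnum):
    """Hypothetical protection requirement levels for PII."""
    LOW = 1     # e.g. business contact data
    MEDIUM = 2  # e.g. customer profiles
    HIGH = 3    # e.g. health or financial records

# Illustrative mapping: (PII type, processing type) -> required level.
REQUIRED_LEVEL = {
    ("contact_data", "storage"): ProtectionLevel.LOW,
    ("contact_data", "analytics"): ProtectionLevel.MEDIUM,
    ("health_record", "storage"): ProtectionLevel.HIGH,
    ("health_record", "analytics"): ProtectionLevel.HIGH,
}

def provider_allowed(pii_type: str, processing: str,
                     certified_level: ProtectionLevel) -> bool:
    """A provider may process the data if its audited/certified
    protection level meets the requirement, regardless of where
    its data center is located. Unknown combinations default to
    the highest requirement."""
    required = REQUIRED_LEVEL.get((pii_type, processing), ProtectionLevel.HIGH)
    return certified_level >= required

# A provider certified at MEDIUM may store contact data ...
print(provider_allowed("contact_data", "storage", ProtectionLevel.MEDIUM))    # True
# ... but may not run analytics on health records.
print(provider_allowed("health_record", "analytics", ProtectionLevel.MEDIUM))  # False
```

The point of the sketch: the location of the data center appears nowhere in the decision, only the audited protection level of the provider.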

However, IT reality is always far ahead of the laws. Thus we probably have to accept that it will take many years until the laws reflect the reality of today’s globally distributed, service-based IT.



SailPoint and BMC – how to move forward?

14.06.2011 by Martin Kuppinger

There has been a lot of FUD (Fear, Uncertainty, Doubt) regarding Control-SA. The product was moved from BMC to SailPoint in spring 2011, but communication about the impact on customers has been weak (to use a positive term…). After several talks with both SailPoint and BMC, I’d like to provide some information. First of all, SailPoint now owns Control-SA, including the support team and other related human resources. There even is a roadmap for Control-SA, and support for the newer releases (ESS 7.5.x) will be provided for several years to come.

On the other hand, SailPoint IdentityIQ is now on BMC’s price list. It can be bought with BMC contracts, BMC first-level support, and so on. It is the strategic solution for Access Governance and Identity/Access Management offered by BMC. BMC itself focuses only on BIRM (BMC Identity Request Management), not to be confused with BRIM (BMC Remedy Identity Management); BRIM is no longer sold by BMC, and its relevant parts now live on either in BIRM or in SailPoint products (the former Control-SA).

SailPoint will soon provide its own provisioning engine: a sort of lightweight implementation that is controlled by the Access Governance (and Lifecycle Management) components of IdentityIQ and uses the existing Control-SA connectors. SailPoint additionally plans to release new connectors.

This gives customers a lot of choices for moving forward. They can use Control-SA for quite a while, at least if they are on ESS 7.5.x or higher. They might move to the SailPoint provisioning engine, with IdentityIQ on top and the existing connectors. They might migrate to other provisioning tools, and so on. But the most important thing is: Control-SA isn’t dead, and customers can take their time to consider their options. My advice is: take your time and think about how your IAM, Access Governance, and Service Request Management should look in the future.

Some 15 months ago I wrote a research note on “Access Governance Architectures”. It discusses different architectural approaches for Access Governance – and many of them are relevant when rethinking your strategy and architecture around the three topics mentioned above. The most important point is: it is no longer about having exactly one central provisioning tool. Provisioning tools are an important element, but many companies struggle with standardizing on a single tool. There might be tools that have been in use for quite a while in specific environments, sometimes with a lot of customization – think of mainframe connectors. There are mergers and acquisitions, bringing in new tools. There are lobbies pushing specific solutions for the Microsoft Active Directory environment or the SAP infrastructure. And in large organizations, the IT infrastructure might simply be too complex, spread across many organizational divisions.

That’s where integrating layers like Access Governance and/or Service Request Management come into play. They can become the glue between different provisioning systems. And they even make it easier to change things at the provisioning layer. Modular architectures are somewhat more complex from an architectural and integration perspective, but they give you more flexibility for change, as the sketch below illustrates.
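Here is a minimal sketch of that glue-layer idea, assuming a hypothetical common interface that each provisioning system implements. All class and method names are invented for illustration and do not correspond to any actual product API:

```python
from abc import ABC, abstractmethod

class ProvisioningSystem(ABC):
    """Hypothetical common interface the integration layer talks to."""
    @abstractmethod
    def grant(self, user: str, entitlement: str) -> None: ...

class ControlSAAdapter(ProvisioningSystem):
    def grant(self, user: str, entitlement: str) -> None:
        print(f"[Control-SA] granting {entitlement} to {user}")

class ADProvisioningAdapter(ProvisioningSystem):
    def grant(self, user: str, entitlement: str) -> None:
        print(f"[AD tool] granting {entitlement} to {user}")

class AccessGovernanceLayer:
    """Routes approved access requests to whichever provisioning
    system owns the target application; systems can be swapped out
    without touching the request/approval layer above."""
    def __init__(self) -> None:
        self._routes: dict[str, ProvisioningSystem] = {}

    def register(self, application: str, system: ProvisioningSystem) -> None:
        self._routes[application] = system

    def fulfill(self, user: str, application: str, entitlement: str) -> None:
        self._routes[application].grant(user, entitlement)

glue = AccessGovernanceLayer()
glue.register("mainframe", ControlSAAdapter())
glue.register("intranet", ADProvisioningAdapter())
glue.fulfill("jdoe", "mainframe", "RACF group PAYROLL")
```

Migrating away from one provisioning system then comes down to registering a different adapter for the affected applications, while the request and approval logic above stays untouched.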

Looking at Control-SA environments, putting such a layer on top (which might be SailPoint IdentityIQ but could just as well be another Access Governance tool, an SRM tool, or a portal) allows you to migrate away from Control-SA at your own pace to whatever you want – or to add other provisioning tools if required. This gives you flexibility. And in most cases it is a better choice than simply replacing one monolith with another. By the way: the same holds for all the other provisioning systems, which might have to be migrated at some point as well.

Thus: evaluate your options first. Build a future-proof architecture (as future-proof as one can be based on what is there today). Then decide what to do with Control-SA, and when. This will give you more time for your decisions, and you will most likely end up with a better solution. Whether you then end up with a pure SailPoint solution, a mixed SailPoint/BMC (BIRM) solution, a mixed-vendor solution, or a solution provided entirely by another vendor depends on your requirements. But it should be a well-thought-out decision, not something done in a hurry.


Symlabs now part of Quest

08.06.2011 by Martin Kuppinger

Quest just acquired another vendor in the IAM market. Symlabs is definitely something of a “hidden gem”, a vendor that is not very well known. That isn’t surprising, given that Symlabs mainly focuses on Federation (somewhat popular) and Virtual Directory Services (not as popular as they should be).

From a Quest perspective, Symlabs adds some missing pieces to the increasingly complete puzzle of the Quest Identity Management portfolio, the Quest One Identity solutions. Starting with some Active Directory-centric solutions some time ago, Quest has managed to build one of the broadest IAM portfolios in the entire industry by selectively acquiring vendors like Völcker Informatik or Symlabs – both of them, by the way, European vendors.

Virtual directory technology provides access to data from various sources such as directories and databases, and flexibly consolidates this data into virtual directories – at runtime, without building yet another physical directory through (more complex) synchronization. I’m a strong believer in virtual directory services for several (not all!) use cases, and my experience from a large number of advisory workshops with end users is that they all become interested in virtual directory services once they have learned about that type of technology. Thus, this non-intrusive technology not only enhances Quest’s ability to integrate with different directory services and to access the data therein, but might also become a door-opener to new customers.
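To make the contrast with synchronization-based meta directories tangible, here is a minimal sketch of the virtual directory idea; the backends, attribute names, and sample data are invented for illustration:

```python
# Sketch of the virtual directory idea: queries are answered at
# runtime by consulting the backends and merging results, instead of
# synchronizing everything into yet another physical store.

class LdapBackend:
    """Stand-in for an LDAP directory (hypothetical sample data)."""
    def lookup(self, uid: str) -> dict:
        data = {"jdoe": {"uid": "jdoe", "mail": "jdoe@example.com"}}
        return data.get(uid, {})

class HrDatabaseBackend:
    """Stand-in for an HR database (hypothetical sample data)."""
    def lookup(self, uid: str) -> dict:
        data = {"jdoe": {"department": "Finance", "employee_id": "4711"}}
        return data.get(uid, {})

class VirtualDirectory:
    """Presents one consolidated entry per user, assembled on the fly."""
    def __init__(self, backends):
        self.backends = backends

    def get_entry(self, uid: str) -> dict:
        entry: dict = {}
        for backend in self.backends:  # query each source at runtime
            entry.update(backend.lookup(uid))
        return entry

vd = VirtualDirectory([LdapBackend(), HrDatabaseBackend()])
print(vd.get_entry("jdoe"))
# {'uid': 'jdoe', 'mail': 'jdoe@example.com', 'department': 'Finance', 'employee_id': '4711'}
```

The sources keep their data; only the consolidated view is new, which is what makes the approach non-intrusive.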

In addition, Quest now has federation technology of its own, another cornerstone of IAM. This will help Quest expand its Single Sign-On and authentication offerings, but might also help it add (incoming) federation support as a standard feature of its other solutions.

From my conversations with Quest I know that they have a plan for IAM – and they are executing on it successfully, at least when it comes to acquisitions. However, the more Quest acquires, the more it will have to work on integration and on positioning itself not as a vendor of a set of tools but as a vendor of solutions. It will be interesting to observe how Quest executes on that part of what should be in the plan.


Be prepared for BYOD

06.06.2011 by Martin Kuppinger

BYOD: again one of those acronyms. It stands for “Bring Your Own Device”. You could also say it stands for IT departments accepting that they’ve lost against their users: they have lost the discussion about which devices shall be allowed in corporate environments. When I travel by train, I observe an impressive number of different devices in use. There are Windows notebooks, netbooks, iPads, iBooks, other types of “pads”, smartphones, and more.

For a long time, corporate IT departments have tried to limit the number of devices to a small list, so they could manage and secure them. However, reality, especially in the world of mobile devices, proves that most IT departments have failed. For sure, many have restricted access to corporate email to BlackBerry devices. But many haven’t managed to achieve even that target. And the popularity of Apple devices increases the heterogeneity of devices used by employees.

It increasingly looks like the only solution is acceptance. Accept that users want to use different types of devices. Accept that innovation, especially around smartphones and pads, moves far more quickly than corporate IT departments can adapt their management tools.

At first glance that sounds like a nightmare for corporate IT departments. How to manage these devices? How to secure them? However, it is not about managing or securing the devices. That would be “technology security”. It is about managing and securing information, i.e. “information security”. It’s about the I in IT, not the T. Thus, we have to look at when to allow access to which information using which tool.

To do this, a simple matrix might be the starting point. The first column contains the classes of devices – notably not every single device. The first row contains the applications and information being used. In the cells you define the requirements, based on the risk scores of both the device class and the information. In some cases you might allow access via secure browser connections; in others you might require virtual desktop connections; in still others you might end up having to build a specialized app. However, if banks are able to secure online banking on smartphones, why shouldn’t you be able to secure your corporate information on these devices?
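As a minimal sketch of such a matrix in code, with hypothetical device classes, information classes, and access requirements in the cells:

```python
# Hypothetical device and information classes; the access requirement
# in each cell reflects the combined risk of device and information.
ACCESS_MATRIX = {
    # (device class, information class): required access path
    ("managed_notebook", "public"):       "direct",
    ("managed_notebook", "confidential"): "direct",
    ("byod_smartphone", "public"):        "secure_browser",
    ("byod_smartphone", "confidential"):  "dedicated_app",
    ("byod_tablet", "confidential"):      "virtual_desktop",
}

def required_access_path(device_class: str, info_class: str) -> str:
    """Look up the cell; combinations not defined in the matrix
    are denied by default (risk-based, deny-by-default)."""
    return ACCESS_MATRIX.get((device_class, info_class), "deny")

print(required_access_path("byod_smartphone", "confidential"))  # dedicated_app
print(required_access_path("byod_tablet", "public"))            # deny (not defined)
```

The deny-by-default lookup is the point: anything you haven’t consciously risk-assessed stays blocked until you add a cell for it.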

You might argue that building apps or deploying desktop virtualization is quite expensive. However, trying to manage all these different devices, or trying to restrict which devices are allowed, is expensive as well – and much more likely to fail. I’m not saying that it is easy to protect your corporate information in a heterogeneous environment that supports BYOD. But it is much more likely to be feasible than managing and securing every single device – given the increasing number of these devices, the speed of innovation, and the simple fact that corporations don’t own all of them.

Thus it is about preparing for BYOD by providing a set of secure paths for accessing corporate information and protecting that information – and by understanding how to protect which information where. When you start with BYOD, do it risk-based.


© 2014 Martin Kuppinger, KuppingerCole