Don’t start with technology – but understand technology first

23.02.2011 by Martin Kuppinger

I still too frequently observe that organizations are too quick when it comes to technology decisions. In many organizations, there is first a decision that “provisioning”, a “web application firewall”, “single sign-on”, or even “identity management” is needed. Then some people google for these terms, find some vendors, and decide on a solution. That fits with requests like “We’d like to have identity management running by the end of the year – could you support us?”

On the other hand, I frequently observe that many customers aren’t aware of important technologies like Access Governance or Virtual Directory Services, to name just two. But if you don’t know what’s out there, how can you be sure that the solution you’ve chosen really is the best one?

Successful projects also require a good understanding of which types of technologies are out there and which are best suited to support solving specific problems (technology doesn’t solve the problems, but it can support doing so). That, in turn, requires not only understanding the real problems (challenges, issues, threats, …) which have to be solved, but also understanding how to solve them. This leads to specific requirements and to knowledge about the mandatory requirements and priorities. It also helps in understanding which of several overlapping technologies (or which parts of them) is the best one to start with. Once you have done all this and defined a book of rules, processes, and so on, you can start choosing the product within a specific category.

And yes, correct: that takes a little longer than just choosing the product. But it will lead to decisions based on facts and not on uncertainty.


Lessons enterprises should learn from the recent Wikileaks leak

17.12.2010 by Martin Kuppinger

There has been a lot of discussion around Wikileaks publishing an incredible amount of data which had been classified as confidential by the US Government. I don’t want to discuss that case specifically – many people have done so before, with fundamentally different conclusions. More interesting is what this means for private organizations, especially enterprises. Wikileaks has threatened some of them: the Russian oligarchs, the finance industry in general. That comes as no surprise. Wikileaks founder Assange rates them as “bad”, i.e. his enemies. Given that Wikileaks isn’t alone out there, there is an obvious threat to any enterprise. Some might think that construction plans of the defense industry should be published. Others might think the same should be done with blueprints from the automotive industry after claimed incidents. Or with the cost accounting of the utilities if power or gas appears to be too expensive. I don’t want to judge the reasons – I have my personal opinion on this, but that’s out of the scope of this post.

Looking at that situation from an enterprise perspective, it becomes obvious that information security has to move to the top of the CIO agenda (and the CEO agenda!) if it isn’t there yet (and given that the enterprise isn’t willing to share everything with the public – blueprints, calculations, whatever…). That requires approaches which are somewhat more fine-grained than the ones which obviously have been in place in the US government, allowing a private (or something like that – I’m not that familiar with the ranks in the US military) to access masses of documents. It also requires efficiently protecting the information itself instead of only the information system. Information tends to flow, and once it is out of the system, system-level security no longer applies.

That leads inevitably to the topic of Information Rights Management (IRM), which is a frequent topic in the blogs of Sachar Paulus and me – just have a look at our blogs. However, implementing IRM the typical way in organizations requires centralized policies, classifications, and so on. And classification obviously failed in the last Wikileaks incident. Thus, I’d like to bring in an idea Baber Amin recently brought up in a discussion during a KuppingerCole webinar. He talked about “identity-based encryption”, which in fact means encrypting information in a way which is controlled by the individual user. That leads to an IRM where the individual user controls who is allowed to use the information he creates or owns. It is not (mainly) the organization.

But: Will that work? Some arguments and counter arguments:

  1. Information is not accessible once the user leaves the organization: Not correct – there could be an additional “master” key to allow recovery and so on (see the sketch after this list). Many lessons could be learned from Lotus Notes in that area, to name an example.
  2. There are no corporate policies: Not correct – these could be understood as a second level of protection, adding to the first level managed by the user. In other words, classical IRM and personalized IRM could be combined.
  3. It won’t work because the user doesn’t understand what to do: Not correct. Just look at how users deal with information security in their daily lives. For sure some things go wrong and lessons have to be learned (not to appear drunk in a photo on Facebook, for example), but overall it works pretty well. Combined with the corporate policies, that should turn out to be much better than corporate policies only. Trust the employee and the wisdom of crowds.
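To make the first two counter-arguments a bit more tangible, here is a minimal sketch (in Python, using the cryptography package) of how user-controlled encryption could coexist with an organizational recovery key: the document is encrypted with a fresh content key, and that key is wrapped once for the user and once for the organization. This is purely my illustration of the principle, not the mechanism of any particular IRM or identity-based encryption product.

```python
# Sketch of user-controlled protection with an organizational recovery key
# (envelope encryption). Illustration only – not how any specific IRM or
# identity-based encryption product works. Requires the 'cryptography' package.
from cryptography.fernet import Fernet

# Each party holds its own key: the user who owns the document,
# and (optionally) the organization for recovery purposes.
user_key = Fernet.generate_key()
org_recovery_key = Fernet.generate_key()

def protect(document: bytes, user_key: bytes, recovery_key: bytes) -> dict:
    """Encrypt the document with a fresh content key, then wrap that
    content key once for the user and once for the organization."""
    content_key = Fernet.generate_key()
    return {
        "ciphertext": Fernet(content_key).encrypt(document),
        "wrapped_for_user": Fernet(user_key).encrypt(content_key),
        "wrapped_for_org": Fernet(recovery_key).encrypt(content_key),
    }

def unprotect(package: dict, key: bytes, role: str = "user") -> bytes:
    """Unwrap the content key with whichever key the caller holds."""
    wrapped = package["wrapped_for_user" if role == "user" else "wrapped_for_org"]
    content_key = Fernet(key).decrypt(wrapped)
    return Fernet(content_key).decrypt(package["ciphertext"])

package = protect(b"confidential blueprint", user_key, org_recovery_key)
assert unprotect(package, user_key) == b"confidential blueprint"
assert unprotect(package, org_recovery_key, role="org") == b"confidential blueprint"
```

The point of the sketch is simply that user-controlled protection and an organizational second line of defense are not mutually exclusive; the recovery key can be governed by corporate policy while the user stays in control of day-to-day sharing.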

Simply put: think about doing it differently than before. It is not about adding new tools at the (perforated) perimeter and all these point solutions. It is about building a few consistent lines of defense, including and especially next-generation IRM. For sure there is some way to go, and the tools aren’t there yet. But when thinking about how to protect your intellectual property and the secrets your organization wants to keep (for whatever reason – I don’t judge here…), you should definitely think beyond the traditional approaches of IT security – look especially at Information Security instead of Technology Security, i.e. the I and not the T in IT.

If you think this topic is worth thinking about, you shouldn’t miss EIC 2011 – the conference on IAM, GRC, Cloud Security and thus also on the things discussed in this post. And don’t hesitate to ask for our advisory services ;-)


Finally: Novell is sold

23.11.2010 by Martin Kuppinger

I’ve been following Novell for more than 20 years now. And for roughly the same period of time there have been rumours of other companies acquiring Novell. But it never happened. Not really, at least. You could argue that the acquisition of Cambridge Technology Partners was sort of a takeover of Novell by Cambridge, with Jack Messman becoming CEO and so on. But in the end, Novell was on its own again. Yesterday, however, the news spread that Attachmate is buying Novell – finally, they are sold. Attachmate will keep Novell as a separate business unit and maintain the Novell and Suse brands. In other words: there won’t be that many changes from a customer perspective at first glance.

Looking at Attachmate and NetIQ, it becomes obvious that Attachmate is, at this point in time, keeping its acquisitions somewhat separate. There is still a NetIQ website, and the NetIQ brand is still maintained. Behind the scenes there is integration – but not where it faces the customer. It is most likely that the same strategy will be followed with Novell.

However, the questions are whether, when, and how Attachmate will start to build on the potential of tighter integration between its different “divisions”, i.e. the classical Attachmate, NetIQ, and Novell. There is significant potential for integration – look at the broad support for different environments, from the mainframe to NetWare, Linux, and Windows. Look at the expanded capabilities for managing networks, delivered by NetIQ and Novell. And think about what the outcome for “intelligent workload management”, i.e. the optimization and management of workloads in virtualized/cloud environments, could be if all the strengths of Attachmate, NetIQ, and Novell are put together. Thus, there is some interesting potential for the future.

The question that remains is: what does this mean for existing Novell customers, and what should they do? The answer at this point in time is simple: stay calm and proceed as planned. There is no reason to move away from Novell – on the contrary: Novell is now part of a significantly larger organization, and since it finally has been acquired, the acquisition rumours are a thing of the past. And the opportunities of this acquisition for existing Novell customers are significantly greater than the risks – especially if Attachmate starts to leverage the potential synergies between the different companies within the conglomerate.


Cloud Computing is mainly Service Management

28.10.2010 by Martin Kuppinger

When looking at all the discussions around the “cloud”, I still miss some focus on the real essentials of a strategic (!) approach to using clouds. Clouds – given the now common understanding of private, hybrid, and public clouds – are in fact nothing else than IT environments which produce IT services. These services are provided at many different layers, as in the common (and pretty coarse-grained) segmentation into SaaS, PaaS, and IaaS. But: it is about the (efficient, scalable, …) production of standardized, reusable services.

Cloud Computing is about using these services. It is about procurement, management, orchestration, accounting, and so on. In other words: Cloud Computing is mainly about service management, in a standardized way. In a perfect world, all services from all providers (internal and external) would be managed consistently. There could be one consistent accounting, ending up with something like an ERP for IT. However, the service management aspect of Cloud Computing doesn’t appear to be at the centre of most discussions around Cloud Computing. Many discussions are just tactical comparisons and views of parts of Cloud Computing. Many discussions are around security. But what about service management, the really strategic thing? The part which will fundamentally change the way we are doing IT?

For sure there is a lot of discussion around service management today – ITIL is a good example. However, that covers just a part of IT. We have to look at it from the highest layer (business and its requirements, described as real business services like “managing contracts of type … in compliance with regulations and…”) down to the granular web services used in SOA architectures. Services are sort of everywhere. And the future of IT is about having two layers:

  • Service production (In the Clouds)
  • Service consumption (Cloud Computing)

That requires fundamental changes in IT organizations. The core competency becomes being best in class at mapping business requirements to the required services, i.e. doing the “cloud computing” part right. For the “production” part of IT, it is about becoming best in class in providing efficient services. Typical IT organizations will thus be split into two parts: consumption/orchestration/management and so on – and production in the private cloud environment. Enabling this shift is the key issue for any organization today.

You might now ask, “what about security?” Pretty easy: security is a part of this. Every service has a functional part and a “governance” part: where is the service allowed to run, due to compliance? What about encryption of transport and data? Who is allowed to access the service (or parts of it)? And so on… In other words: when you’ve solved the service management piece, you’ve automatically solved at least a large portion of the security piece. You might argue that there are some infrastructural aspects not covered by this (how to enforce what you need for service governance). But those could just as well be understood as part of your service environment.
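To make this more concrete, here is a minimal sketch of a service description that carries its governance attributes alongside its functional interface. The field names, endpoint URL, and values are purely illustrative assumptions, not any standard or product schema.

```python
# Sketch of a service description that treats governance as part of the
# service itself. Names and values are illustrative assumptions only.
from dataclasses import dataclass
from typing import List

@dataclass
class GovernancePart:
    allowed_regions: List[str]       # where the service may run (compliance)
    transport_encryption: str        # e.g. "TLS 1.2"
    data_encryption_at_rest: bool
    allowed_roles: List[str]         # who may access the service

@dataclass
class ServiceDescription:
    name: str
    business_capability: str         # the business service it supports
    endpoint: str
    governance: GovernancePart
    sla_availability: float = 99.9   # percent

contract_service = ServiceDescription(
    name="contract-management",
    business_capability="Manage contracts in compliance with regulations",
    endpoint="https://services.example.com/contracts",   # hypothetical URL
    governance=GovernancePart(
        allowed_regions=["EU"],
        transport_encryption="TLS 1.2",
        data_encryption_at_rest=True,
        allowed_roles=["contract-manager", "auditor"],
    ),
)

# A simple governance check the "consumption" layer could run before
# orchestrating the service into a business process.
def may_deploy(service: ServiceDescription, region: str) -> bool:
    return region in service.governance.allowed_regions

assert may_deploy(contract_service, "EU")
```

The design point is that the consuming side never handles a “bare” service: procurement, orchestration, and accounting always see the governance part together with the functional part, which is what makes security a by-product of good service management.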

A lot of aspects around Clouds, Cloud Computing, Cloud and Services, Cloud Security and so on will be discussed at EIC 2011/Cloud 2011 in Munich, May 10th to 13th.


Cloud, Automation, Industrialization

21.07.2010 by Martin Kuppinger

Cloud Computing is still a hot topic. And there are still many different definitions out there. I personally tend to differentiate between two terms:

  • Cloud: An IT environment to produce IT services.
  • Cloud Computing: Making use of these services – procurement, orchestration, management,…

Thus, the internal IT can be understood as one of many clouds; there might even be multiple internal clouds. But we don’t have to care that much about internal, external, public, private, hybrid, … The prerequisite for an IT environment to be understood as a cloud is service orientation, i.e. the production of well-described services. That might be done in a more or less scalable way – but it is about services.



Beyond LDAP – have a look at system.identity

20.06.2010 by Martin Kuppinger

LDAP (Lightweight Directory Access Protocol) is well established. It is the foundation for today’s Directory Services, which support LDAP as a protocol and which usually build their data structure on the associated LDAP schema. There are many interfaces for developers to use LDAP, from the LDAP C API to high-level interfaces for many programming environments.
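As a point of reference for the discussion below, this is roughly what working against an LDAP directory looks like from one of those high-level interfaces – a minimal sketch in Python using the ldap3 library; the server name, bind DN, and base DN are made-up values.

```python
# Minimal LDAP search sketch using the ldap3 library.
# Server address, credentials, and DNs are hypothetical examples.
from ldap3 import Server, Connection, ALL

server = Server("ldap.example.com", get_info=ALL)
conn = Connection(server,
                  user="cn=admin,dc=example,dc=com",
                  password="secret",
                  auto_bind=True)

# Search a subtree for person entries and read a few attributes.
conn.search(search_base="ou=people,dc=example,dc=com",
            search_filter="(objectClass=person)",
            attributes=["cn", "mail"])

for entry in conn.entries:
    print(entry.cn, entry.mail)

conn.unbind()
```

Note how the search is anchored to a single base DN in a single directory tree – exactly the two characteristics the limitations below are about.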

Even though LDAP is well established, it is somewhat limited. There are several restrictions – two important ones are:

  • The structure of LDAP is (more or less) strictly hierarchical. There is one basic structure for containers – and linking leaf objects (think about the association of users and groups) is somewhat limited. That structure is a heritage of X.500, from which LDAP is derived – LDAP originally being the lightweight version of DAP, the Directory Access Protocol. X.500 was constructed by telcos for telcos, i.e. with respect to their specific needs for structuring information. However, anyone who has ever thought about structuring Novell’s eDirectory or Microsoft’s Active Directory knows that there is frequently more than one hierarchy, for example the location and the organizational structure. The strict hierarchy of LDAP is an inhibitor for several use cases.
  • LDAP is still focused on the specific, single directory. It doesn’t address the need to store parts of the information in fundamentally different stores. But the same piece of information might be found locally on a notebook, in a network directory like Active Directory, in a corporate directory, and so on. How do you deal with that? How do you use the same information across multiple systems, exchange it, associate usage policies, and so on? That is out of scope for LDAP.

I could extend the list – but this is not about the limitations of LDAP. LDAP has done a great job for years, but there is obviously a need to take the next big step. An interesting foundation for that next big step comes from Kim Cameron, Chief Identity Architect at Microsoft. He has developed a schema which he calls system.identity. There hasn’t been much noise around it so far: there is a stream from last year’s Microsoft PDC, there is a little information on MSDN plus a blog post, and there is the keynote from this year’s European Identity Conference. But it is worth having a look at. The approach of system.identity is to define a flexible schema for identity-related information which can cover everything – from local devices to enterprise- and internet-style directories, from internal users to customers and device identities, including all the policies. It is, from my perspective, a very good starting point for the evolution (compatibility with LDAP is covered) well beyond LDAP and today’s directories.
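To illustrate the kind of flexibility I mean, here is a simplified sketch of an identity record that carries claims from multiple stores and supports several independent hierarchies. This is my own illustration of the idea described above – explicitly not the actual system.identity schema, whose details are documented in the sources mentioned.

```python
# Simplified illustration of a claims-based identity record that is not
# bound to a single hierarchy or a single store. My own sketch of the idea,
# NOT the actual system.identity schema.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Claim:
    kind: str            # e.g. "mail", "role", "device-ownership"
    value: str
    source: str          # which store/authority asserted it
    policy: str = ""     # usage policy attached to this piece of information

@dataclass
class IdentityRecord:
    identifier: str
    claims: List[Claim] = field(default_factory=list)
    # Multiple, independent hierarchies instead of one strict tree:
    memberships: Dict[str, List[str]] = field(default_factory=dict)

user = IdentityRecord(
    identifier="urn:example:user:jdoe",              # hypothetical URN
    claims=[
        Claim("mail", "jdoe@example.com", source="corporate-directory"),
        Claim("device-ownership", "notebook-4711", source="local-device",
              policy="do-not-sync-externally"),
    ],
    memberships={
        "location": ["Europe", "Germany", "Stuttgart"],
        "organization": ["Research", "Identity Management"],
    },
)

# The same record can still be projected into an LDAP-style view where needed,
# e.g. by flattening one of the hierarchies into a DN-like string.
dn = ",".join(f"ou={ou}" for ou in reversed(user.memberships["organization"]))
print(dn)   # ou=Identity Management,ou=Research
```

The point is that the hierarchy becomes just one of several views on the identity data, and policies travel with the individual pieces of information rather than with a single directory.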

I recently put the concept to a stress test in a customer workshop. The customer is thinking about a corporate directory. Most people there are not directory guys but enterprise IT architects. And they definitely liked the path system.identity is showing. It covers their needs much better than the LDAP schema does. That proved to me that system.identity is not only for geeks like me but obviously for the real world. Thus: have a look at it and start thinking beyond LDAP. The concept of system.identity, despite being at an early stage, is a very good place to start.


Is an insecure smart planet really smart?

25.03.2010 by Martin Kuppinger

There is a lot of talk about making our planet smarter. Despite being far too much fiction, the film “Die Hard 4.0” has touched on some of the potential risks around this. I recently had a very interesting discussion with a forensics/incident expert from the US. We discussed several issues and ended up at the idea of this “smarter planet” and the “smart grid” as one of its most prominent elements. Per se, the idea of having a networked infrastructure in many areas, with a high degree of flexibility and increased service availability, is as appealing as it is inevitable – things will go down that path.

However, the security of that future seems to be somewhat ignored, at least in the public discussion. For sure, politicians aren’t interested in the dark side of things as long as the bright side is being discussed. They don’t want to be the party poopers. Only if there is an incident will they claim that they have done everything to avoid it and that everyone else is guilty but not them. Vendors, on the other hand, are mainly interested in driving things forward. Most of them for sure don’t ignore security – but it seems to be more of a pain than an opportunity.

Thus, we currently observe on a large scale the same thing we can see day by day on a small scale: security is ignored when driving things forward. That is true for a tremendous part of the software which is developed, it is true for new standards in IT (think about web services – security was missing at the beginning), and it is true for so many other areas. And now the same thing seems to be happening with all these smart things. But, from my perspective, then these things aren’t really smart.

Just think about the smart grid. It is sort of a massive data retention mechanism, collecting data from and networking millions of households with the utilities. There are privacy threats – who has used which electric device, and when? There are new attack surfaces. For sure there are some things going on around security. But from what I observe, security is developing more slowly than the rest of the smart planet initiatives. It’s sort of a ticking time bomb out there.

What will happen? Security is undervalued. For sure it isn’t ignored, but it won’t have the relevance it should have in these projects. People will cheer when projects deliver some results. Security will become a problem. There will be unpleasant discussions about who is to blame. Security issues will be patched – to some degree. Wouldn’t it be a better idea to build security into the concepts from scratch? To really have a smarter planet at some point in time?

Sorry for being the party pooper!

