The German data protection law starts to bite

29.10.2009 by Martin Kuppinger

Deutsche Bahn has been fined 1.1 million Euro for breaches of the German data protection law, i.e. the privacy regulations in Germany. That is the record penalty based on the BDSG (Bundesdatenschutzgesetz), as the law is formally called. The reason for the penalty was the abusive analysis of employee data to identify potential cases of corruption and fraud: bank account data of suppliers and employees was compared. When that became public, there was a lot of public discussion – the topic was top of the news for several days. And the CEO, Hartmut Mehdorn, was (in effect) fired.

However, dealing with corruption and fraud is a must for the management of any corporation. Heinrich von Pierer, the former CEO of Siemens, had to leave the company because he didn't address corruption and fraud. Hartmut Mehdorn did address it – and lost his job as well. Obviously, there are regulations in conflict. The problem in both cases was the lack of a valid concept of which regulations are relevant, which are in conflict, and how to deal with these conflicts. The Bahn analyzed far too much data and didn't embed that approach in a bigger concept, openly discussed with the works council and so on.

So one lesson which should be learned by everyone with responsibility for compliance regulations (and the BDSG is one of them) is: Analyze the relevant regulations, clearly define a valid approach to deal with them, discuss it with the works council as far as employee data is affected, and talk with your auditors – in short, have a strategic approach for operationalizing the regulations.

The second interesting aspect of the “Bahn” case is that the penalty is a record penalty – and still only 1.1 million Euro, which is more or less paid out of the petty cash. It hurt some people at the Bahn, who lost their jobs. But it is only a small penalty from the perspective of a large corporation. It seems that the BDSG is sort of a law without teeth (the German saying is “toothless tiger”…). But there is good news (from the perspective of enforcing privacy and data protection): The new amendments to the BDSG will change things fundamentally – the tiger will get teeth.



Social networks could be secure!

22.10.2009 by Martin Kuppinger

Yesterday, I read an article on a German news website about the recent security leaks found in the social network SchülerVZ. The article claims that social networks like SchülerVZ and Facebook (both are mentioned) don't have any chance of preventing crawlers from accessing personal data which should be visible only to friends. Ridiculous!

Sorry, that is definitely nonsense!

It is very simple. You have some data which is visible only to specific persons. You have an authorization policy, which might be expressed in the form of ACLs or XACML or whatever. Some application (the regular frontend, a crawler, an administrative application,…) tries to access the data. You have authenticated the caller. You do the authorization by comparing the authentication information with the authorization information. You decide whether access is allowed or not. That is done in millions of applications day by day. And that shouldn't work for social network sites? I don't see any real reason why!
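
To illustrate how little magic is involved, here is a minimal sketch of such a check in Python. All names (the profile fields, the ACL structure, the viewer classification) are invented for the example:

```python
# Hypothetical ACL check: every profile field lists who may read it.
acl = {
    "name":     {"public", "friends"},
    "birthday": {"friends"},
    "photos":   {"friends"},
}

def relationship(viewer, owner, friends):
    """Classify the authenticated viewer relative to the profile owner."""
    if viewer == owner:
        return "owner"
    return "friends" if viewer in friends else "public"

def can_read(viewer, owner, friends, field):
    """Authorization: compare the authentication information (who the
    viewer is) with the authorization information (the field's ACL)."""
    rel = relationship(viewer, owner, friends)
    return rel == "owner" or rel in acl.get(field, set())

# A crawler is just another non-friend caller and gets denied:
print(can_read("crawler", "alice", {"bob"}, "photos"))  # False
print(can_read("bob", "alice", {"bob"}, "photos"))      # True
```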

For sure, there are two reasons why at least some social networks don't do it that way:

  • Bad software architecture: Security has to be designed in from the very beginning; otherwise it is hard to implement. Unfortunately, many developers don't design security into their products but add it at the end, as something painful they have to do at the minimum level.
  • Performance considerations: For sure, security will affect performance. For every access, you will have to do security checks. You will even have to provide stronger authentication features. But it can be done. Providers will probably need some more hardware to keep the performance level of their social networks. But security has its price.

But to be honest: These aren't valid excuses. Either you are able to deploy a social network in a secure way and fulfill the data protection laws, or you should shut the entire thing down. Given that it is possible to secure social networks, the operators should be held fully responsible for any security breach.

By the way: Even the databases themselves can be fully secured. That depends a little on the database chosen and the additional technologies in place, like Oracle's Database Security products (to mention one of the more advanced solutions). OK, that will again cost you some performance and some money. But again, it is about “security first”. If the providers of social networks can't afford the cost of security, their business model just doesn't work.


XACML – why it is so important

22.10.2009 by Martin Kuppinger

XACML (eXtensible Access Control Markup Language) is gaining attention as one of the core standards in the field of information security and thus IT security. Whilst standards like SAML (Security Assertion Markup Language) address the problem of authentication, XACML is about authorization – the more complex challenge. XACML allows the definition and exchange of authorization policies in a heterogeneous environment. Whether it is about cloud security and controlling the authorization policies of cloud services, or about SOA security for internal applications: XACML supports authorization management in such use cases.
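
To make the decision model concrete, here is a deliberately simplified sketch in Python (not in XACML's actual XML syntax): attributes of subject, resource, and action go in, a Permit/Deny decision comes out. The rules and attribute names are invented for illustration:

```python
# Simplified policy decision point (PDP). Real XACML policies are XML
# documents with targets, rules, and combining algorithms; this only
# mirrors the attribute-based, evaluated-at-runtime decision model.
policies = [
    # Each rule: a condition over the request attributes, plus an effect.
    (lambda r: r["action"] == "read" and r["role"] == "auditor", "Permit"),
    (lambda r: r["action"] == "write" and r["role"] == "owner", "Permit"),
]

def decide(request):
    """Evaluate the rules at runtime; deny by default if nothing matches."""
    for condition, effect in policies:
        if condition(request):
            return effect
    return "Deny"

print(decide({"role": "auditor", "action": "read", "resource": "report"}))  # Permit
print(decide({"role": "guest", "action": "write", "resource": "report"}))   # Deny
```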

However, there is no such thing as a free lunch: XACML requires not only tools like XML/SOA Security Gateways which support the standard, or cloud services with XACML support. There are two other important aspects:

  • XACML in fact means a shift from a more static security approach, like ACLs (Access Control Lists), towards a dynamic approach based on policies which are applied at runtime. These dynamic security concepts are more difficult to understand, to recertify, to audit, and to analyze in their real-world implications. Thus, the use of XACML requires not only the right tools but also well-thought-out concepts for policy creation and management.
  • XACML is just a foundation for expressing policies. Within a use case, the policy concepts still have to be defined. Over time, there should be higher-level standards or defined use cases building on XACML, focusing on standardizing the content of these policies.

Anyway, XACML is very useful. One of the most interesting areas for XACML is SOA security. Currently, many SOA-based applications still lack a valid concept for authorization; it is still frequently built directly into these applications. XACML can provide the policies to externalize authorization management and thus add flexibility to SOA-based applications.
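
That externalization is easy to show in code. Instead of hard-coding the check, the application (acting as policy enforcement point) hands the request attributes to an external decision service. The endpoint and payload below are hypothetical; this is a sketch of the pattern, not of any concrete product:

```python
import json
import urllib.request

PDP_URL = "https://pdp.example.com/authorize"  # hypothetical decision service

def is_permitted(subject, resource, action):
    """PEP side: send the attributes to the external PDP and act on the
    answer. The policy itself lives (and changes) outside the application."""
    payload = json.dumps(
        {"subject": subject, "resource": resource, "action": action}
    ).encode("utf-8")
    req = urllib.request.Request(
        PDP_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as response:
        return json.load(response).get("decision") == "Permit"

# The application code stays the same when the authorization policy changes:
if is_permitted("alice", "invoice-4711", "approve"):
    pass  # perform the business operation
```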

Overall, it is – from my perspective – definitely worth spending some time exploring the potential of XACML to improve the security of systems and applications. There are many areas where XACML can be used successfully today. However, as with any emerging technology, there will be a lot of improvements in the managing and consuming applications (and, hopefully, around the standards or use cases building on XACML) over the next few years. Thus the step to XACML has to be considered carefully. The good thing is: It is about standards, thus the risk of lock-in isn't that big.

We will discuss this in more depth in an upcoming webinar. Register for free!


Another approach to IRM

14.10.2009 by Martin Kuppinger

Last week I had a discussion with Seclore, a software company based in Mumbai, India. They focus on Information Rights Management (IRM), one of my favourite research areas. I'm interested in this topic mainly for two reasons:

  1. Information Rights Management is one of the IT topics with the closest relation to the core business topic of Information Security/Protection (including Intellectual Property Rights, IPRs).
  2. Information Rights Management is the approach which allows the ongoing protection of information at rest, in motion, and in use – compared to many other approaches which cover only one of these phases.

Most solutions in that market are based on plug-ins to existing applications which enforce the IRM policies. The policies are managed centrally; the information (documents) is protected by encryption.
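
As a rough illustration of that model (central policy, encrypted document), here is a sketch in Python using the cryptography package. The policy structure and all names are invented, and real IRM products do far more (key distribution, revocation, enforcement of usage rights like printing or copying):

```python
from cryptography.fernet import Fernet

# Central IRM backend (sketch): it holds the key and the usage policy.
key = Fernet.generate_key()
policy = {"doc-42": {"allowed_users": {"alice"}}}

# Protecting a document: the content is useless without the backend.
protected = Fernet(key).encrypt(b"confidential contract text")

def open_document(user, doc_id, ciphertext):
    """Client side: decryption happens only if the central policy permits."""
    entry = policy.get(doc_id, {})
    if user not in entry.get("allowed_users", set()):
        raise PermissionError(f"{user} may not open {doc_id}")
    return Fernet(key).decrypt(ciphertext)

print(open_document("alice", "doc-42", protected))   # works
# open_document("mallory", "doc-42", protected)      # raises PermissionError
```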

Seclore's approach is different in that they don't mandatorily rely on such plug-ins but mainly act “below” the application. The client component (which is required to access protected, i.e. encrypted, documents) analyzes the activities of the application, like access to the file system. One effect of that approach is that a document can be opened with different applications supporting the specific document format.

Even while I personally believe that implementing IRM functionality within the applications (the more common approach of vendors like Microsoft, Adobe, and Oracle) allows tighter control over the actions of a user and application on a document, the Seclore approach has some appeal. It is lightweight and works well today with different applications and in different environments, beyond the enterprise. As long as there is no common standard for the interaction of applications (the policy enforcement points) with the IRM backend systems across different vendors, this is a reasonable workaround. And once there is such a standard, Seclore is very likely to support it. Thus, looking not only at the big vendors but also at Seclore makes sense in these early days of Information Rights Management.


Integration for the cloud

07.10.2009 by Martin Kuppinger

On Monday I met with Matthieu Hug from RunMyProcess in Paris, an interesting start-up company in the “cloud”. Their focus is pretty simple: Integrate the cloud – with what you have internally and with other cloud services. At CeBIT 2008 I gave a presentation about “SaaS” and related topics (we didn't use the term “cloud” at that point in time). One of the three major issues I discussed as threats in that area (and would mention nowadays as cloud threats) is integration. How do you integrate external cloud services with other external services or internal applications? Some of these services provide a set of web service interfaces. But even then, integration is tough work.

RunMyProcess now provides an external “cloud” service to do that integration. They provide pre-configured web services for a series of (external) cloud service providers, including Salesforce.com, SAP Business ByDesign, and Google Apps. And they allow you to define processes which include one or more of these products. That makes it possible to build integrations between such services and existing internal applications. It also allows you to enhance cloud-based services like Google Apps. Matthieu told me that some of his customers are adding workflows to Google Apps to replace Lotus Notes (even though I'd recommend those customers consider LotusLive as an option in that case…). And there are some companies starting to create added-value services by integrating and enhancing cloud services, creating a sort of “industry cloud” or “community cloud”.
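
A toy version of such an integration process, sketched in Python with entirely hypothetical endpoints and field names (this is not how RunMyProcess itself is implemented or configured): fetch data from one cloud service, map the fields, and push the result into another:

```python
import json
import urllib.request

# Hypothetical endpoints standing in for two cloud services to integrate.
CRM_URL = "https://crm.example.com/api/new-orders"
ERP_URL = "https://erp.example.com/api/invoices"

def fetch_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def post_json(url, payload):
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# The "process": pull new orders from the CRM, create matching ERP invoices.
for order in fetch_json(CRM_URL):
    invoice = {"customer": order["account"], "amount": order["total"]}
    post_json(ERP_URL, invoice)
```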

I like the approach of providing an integration platform in that way. It doesn't solve every problem (and more complex platforms built on top of classical application servers might provide more functionality), but it is an answer to one of the biggest threats in the cloud. Thus it is definitely worth having a look at that solution. And it is just another example of the amount of creativity unleashed by the cloud evolution.

If you want to learn more about the cloud, you definitely should attend Cloud 09, Dec 2nd-4th, Munich. And you should always have a look at the KuppingerCole webinars. We do webinars on cloud topics frequently – and there are many recordings of cloud webinars available.



GRC – a heavily segmented market

01.10.2009 by Martin Kuppinger

GRC – Governance, Risk Management, Compliance. A typical buzzword, but well established by now. However, the problem of all emerging markets associated with a buzzword arises here as well: There are many different vendors with different types of offerings, all claiming to solve the GRC problem. But the GRC problem has many facets and is (beyond “we have to manage risk, we have to be compliant”) largely undefined. We'll publish a report on a GRC reference architecture in the next few days, followed, probably in early November, by a market segmentation report placing vendors in one or more appropriate segments. Like every valid and successful emerging market, GRC will move from a large set of different solutions towards a market with some well-defined segments of vendors.

There are the so-called “Enterprise GRC” vendors like Mega, OpenPages, or BWise. But even between these there are significant differences. There are vendors working more at the level of CCM (Continuous Controls Monitoring), including companies like Approva. There are IAM-GRC vendors like Aveksa, BHOLD, Engiweb, SailPoint, and several others. There are IAM solutions with added GRC capabilities – in the meantime most of them. There is GRC support in BSM (Business Service Management) applications. And so on. I don't want to unveil too much from the upcoming reports, which you will find on our website, but would like to focus on another aspect:

Which GRC approach to choose?

First of all, I believe that we have to use the potential of GRC for better interfacing between business and IT. There are business controls, there are IT controls. These have to be mapped. Thus, we should end up with solutions which support the business requirements as well as the IT requirements. That will never be a single solution, but a combination of several: high-level controls and dashboards, CCM approaches, and more specific solutions for different groups of IT controls. It should also be an approach which isn't only “detective” or, more correctly, “reactive”, but finds the balance between proactive/preventive and reactive/detective.

The big picture is relatively easy to describe, as we have done in our reference architecture.

The way towards it is much more difficult. There are many influencing factors, like the industry and size of the organization, the current organizational structure (especially around the responsibility for GRC issues), the process maturity of the organization, the maturity of IT management approaches, and so on. Thus there can be different starting points (and more than one). But in any case, there should be a well-agreed (but coarsely described) “big picture” as the guideline for building a GRC roadmap.

I personally believe that three factors are most important:

  • Providing quick wins
  • Providing a business view which, from the beginning, integrates with IT – manual controls alone aren't sufficient, it is always about the appropriate mix of manual and automated controls
  • Closing the loop – don't focus only on the reactive part (like with pure “access certification”) but start acting on the results, for example by integrating provisioning to fix the detected problems (see the sketch below)
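
“Closing the loop” is easiest to see in code. The sketch below (Python, with invented data; real products wire access certification to a provisioning system) detects entitlements that exceed the approved baseline and triggers remediation immediately instead of just reporting:

```python
# Invented example data: approved baseline vs. actually granted entitlements.
approved = {"alice": {"crm_user"}, "bob": {"erp_user"}}
actual   = {"alice": {"crm_user", "erp_admin"}, "bob": {"erp_user"}}

def revoke(user, entitlement):
    """Stand-in for a call to the provisioning system."""
    print(f"revoking {entitlement} from {user}")

# Detective part: find deviations (what pure access certification reports).
# Closing the loop: act on them right away instead of only flagging them.
for user, granted in actual.items():
    for excess in granted - approved.get(user, set()):
        revoke(user, excess)
```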

These are some of the most important criteria for choosing solutions in the GRC space.

Have a look at our event website for upcoming events and webinars around GRC.

And, for sure, don’t hesitate to ask for our advice on building your GRC “big picture”.

