Facebook – they won’t understand

27.07.2010 by Martin Kuppinger

Today I opened my Facebook account, which I have used actively since yesterday. When going to my settings, the system informed me about changed privacy settings. What it then recommended was ridiculous: all my very tight settings should be opened up. Instead of sharing information only with my friends, the system suggested that I share a lot of information with everyone and other, sometimes sensitive information (religion, political opinions) with friends of my friends. I had to manually change everything back to the “old settings”, which at least was an option I could use. However, from a privacy perspective it is completely unacceptable to suggest such changes. If someone has opted for tight settings, this approach just shows that Facebook still hasn’t understood anything.

Besides this, the options for managing “authorizations” or privacy settings, i.e. controlling who is allowed to see what, are primitive. I can share everything with my friends. But in many cases I want to share some information only with some of my friends. I can use lists, but I can’t, for example, use these lists as a sort of “groups for ACLs (Access Control Lists)” – at least I haven’t managed to find out how so far. But given that I have friends from business and from my private life, it is very obvious that I won’t share everything with everyone, isn’t it?
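
To make the idea of “lists as ACL groups” concrete, here is a minimal sketch in Python. It is purely illustrative – the names, data structures and behavior are my own assumptions about how such a feature could work, not how Facebook actually implements anything.

# Hypothetical sketch: friend lists reused as ACL groups.
# All names and structures are made up for illustration only.

friend_lists = {
    "business": {"alice", "bob"},
    "private":  {"carol", "dave"},
}

# Each piece of content carries an ACL naming the lists that may see it.
posts = [
    {"text": "Slides from today's workshop", "acl": {"business"}},
    {"text": "Holiday pictures",             "acl": {"private"}},
    {"text": "Public announcement",          "acl": {"business", "private"}},
]

def visible_posts(viewer: str) -> list[str]:
    """Return the posts a given friend is allowed to see."""
    return [
        post["text"]
        for post in posts
        if any(viewer in friend_lists[group] for group in post["acl"])
    ]

print(visible_posts("alice"))  # business contact: sees work-related posts only
print(visible_posts("carol"))  # private contact: sees private posts only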

Again, as pointed out here and here, there is no reason not to build social networks securely and with strong privacy settings. For sure it is hard to do this afterwards, once a bad security architecture is in place. But from a technical perspective it is feasible – and it is relatively easy. It requires, however, understanding the need for privacy (which is becoming a market inhibitor for Facebook, at least in some countries, these days) – and acting on that understanding.

Why am I using Facebook anyway? Too many people are using it, and many have told me that it is a better way to stay in touch with contacts than other social networks like Xing or LinkedIn. And, by the way: those other networks aren’t the godfathers or inventors of privacy either… I don’t expect Facebook to ever understand privacy and act accordingly. Thus I’ll keep an eye on what I do and don’t publish there, and I’ll keep my privacy settings very rigid. For sure I could use more than one Facebook account. But that would be harder to manage and a pain for the ones who are “friends” in both my private and my business life.

Just a side note: interestingly, many startups have significant gaps in their overall software architecture and struggle with things like scalability and adding new features. And even more struggle with increasing security requirements. One reason is the missing understanding of security (see the link above). The other is that many startups have CTOs who are pretty inexperienced – interestingly, the ones where the founders (and among them the CTO) are doing a startup for the second or third time perform much better, because they have learned many lessons before. There are – as always – exceptions to that rule, e.g. startups with young CTOs doing a very good job. But these are the exceptions. You could bet on what my rating for Facebook is from that perspective…

By the way: If anyone knows how to control all access to the content in Facebook based on my lists of friends, let me know…


Cloud, Automation, Industrialization

21.07.2010 by Martin Kuppinger

Cloud Computing is still a hot topic. And there are still many different definitions out there. I personally tend to differentiate between two terms:

  • Cloud: An IT environment to produce IT services.
  • Cloud Computing: Making use of these services – procurement, orchestration, management,…

Thus internal IT can be understood as one of many clouds; there might even be multiple internal clouds. But we don’t have to care that much about internal, external, public, private, hybrid,… The prerequisite for an IT environment to be understood as a cloud is service orientation, i.e. the production of well-described services. That might be done in a more or less scalable way – but it is about services.
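
As a rough illustration, here is a minimal sketch of what a “well-described service” could look like as a catalog entry. The attributes and names are my own assumptions for illustration, not a standard or any specific product’s model.

from dataclasses import dataclass

@dataclass
class ServiceDescription:
    """Minimal sketch of a catalog entry for a cloud service.
    All attributes are illustrative assumptions, not a standard."""
    name: str
    description: str
    interface: str            # how the service is consumed, e.g. a REST endpoint
    availability_sla: float   # promised availability, e.g. 0.999
    unit_of_charge: str       # e.g. "per mailbox per month"
    provider: str             # internal cloud, external provider, ...

mailbox_service = ServiceDescription(
    name="Managed Mailbox",
    description="Hosted mailbox including spam filtering and backup",
    interface="https://api.example.internal/mailboxes",
    availability_sla=0.999,
    unit_of_charge="per mailbox per month",
    provider="internal cloud",
)
print(mailbox_service)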



Quest and Völcker – and what about the customers?

13.07.2010 by Martin Kuppinger

Yesterday, Quest announced the acquisition of Völcker Informatik. I blogged about the impact on the IAM (and especially the Identity Provisioning) market yesterday. In this post, I’ll focus on the impact on existing customers. Acquisitions are always a situation where FUD arises – fear, uncertainty, doubt. There are many examples of acquisitions where customers ended up on the losing side, because their products of choice were (or are) supported only for a limited time before they had to migrate to another product. I won’t bash vendors here who have acted like that – you probably all know some examples of that situation.

When looking at Völcker customers, there shouldn’t be much FUD. Völcker will continue its development in Germany, and the leading people will stay on board. Even more, Völcker will have significantly bigger resources available – and given that Völcker is very innovative and also has a strong understanding of IT Service Management, customers should benefit from that. Beyond that, Völcker as part of Quest is a global player instead of a Hidden Gem which is “world-known in Germany” only. In other words: there are many opportunities, and I don’t see many risks. For sure an integration process might slow things down a little. But Quest is experienced enough in integrating acquisitions to mitigate these risks.

On the other side, there are the Quest ARS (ActiveRoles Server) customers. What is in it for them? Quest ARS started as a tool for better, role-based management of Active Directory environments. Today it also supports some other systems. However, it is still Active Directory-centric. Quest has stated that both tools, Völcker ActiveEntry and Quest ARS, will play a vital role in their further strategy, with strong integration between the two. Thus, Quest ARS remains a strong solution for Active Directory environments, and when it comes to heterogeneous environments, ActiveEntry comes into play. It will be interesting to see how much Quest will invest in ARS support for heterogeneous systems. That probably is a slight risk for customers. But overall, the risk is relatively low.

Chances are good that this turns out to be one of the acquisitions where customers of both parties can benefit in the future. The reason is simple: There isn’t that much overlap between the portfolios. And, from the KuppingerCole perspective, there is much more potential for synergies well beyond IAM and Identity Provisioning.

By the way: There are several reports available at www.kuppingercole.com/reports – on Quest products as well as Völcker products, and there is the Hidden Gem report which covers Völcker as the not-so-hidden-anymore vendor.


The first Hidden Gem isn’t hidden anymore!

13.07.2010 by Martin Kuppinger

A few days ago, we published our report on the Hidden Gems 2010 – vendors which are innovative but not that well known, at least not on a worldwide basis. We included 25 vendors. Right now, only 24 of them are still hidden. Völcker Informatik, one of the Hidden Gems, has been acquired by Quest Software. There is a good reason for that: Völcker is, from the Quest perspective, a Gem which might help them shine (even) more than before. And not only from the Völcker perspective.

For sure I like it when a Hidden Gem becomes “more visible”, because it validates our rating of these vendors. So I’m looking forward to seeing who is next.



Quest acquires Voelcker – the IAM market will change…

12.07.2010 by Martin Kuppinger

Today, Quest announced that they will acquire the German Völcker Informatik AG with its ActiveEntry product, a leading-edge identity provisioning solution with some integrated Access Governance capabilities. From my perspective, that is a very interesting acquisition, which brings Quest into a leading position in the overall IAM market. Until now, Quest has been a provider of several point solutions around IAM issues. They had some provisioning capabilities in their ActiveRoles Server before – but that hasn’t been a technically leading-edge product, more an add-on providing some provisioning for Active Directory and a little beyond.

Right now, they are one of the vendors in the market that have solutions in most areas of IAM. They have one of the definitely leading-edge products (from a technology perspective) in the market for identity provisioning. And they have a lot of complementary solutions. Beyond that, ActiveEntry fits very well into the Quest portfolio by supporting Active Directory environments at a high level while going well beyond that. Thus, it is sort of the perfect fit.

Quest is now a full competitor of the big, established players in the market like Oracle, IBM, Novell, and the others. It is in an interesting competitive position regarding Microsoft, Omada and related vendors. And if you look at the number of people working around IAM, Quest is from that perspective, too, one of the vendors with the biggest potential in the market. In other words: this acquisition will heavily affect the IAM market, and Quest will be one of the vendors to really take into account from now on.

There are several reports on Quest and Völcker from KuppingerCole available at www.kuppingercole.com/reports. Have a look at them (or ask us for advice…).


Do we still have to care about directory services?

09.07.2010 by Martin Kuppinger

It has become pretty quiet around directory services over the last few years. When I recall the discussions some 10, 15 or 20 years back about NDS versus LAN Manager (and the underlying domain approach), about Active Directory when it came to market, and even the discussions which came up in the early days of OpenLDAP, it is remarkably quiet nowadays. Are all the problems solved? Are the right directories in place? Are the best solutions chosen when something changes?

When talking with end-user organizations, it becomes obvious that we are far from that state. There are implementations of different directories, and most of them work well for their specific use case. But once it comes to optimization, the situation changes. What to put into Active Directory, and what not? How to optimize the way applications deal with directories? How to best build a corporate directory or a meta directory (the directory as a data store, not the meta directory service as a technology for synchronization!)? How to interface directories for specific use cases, and how to best retrieve information?

There are many aspects to discuss and to understand in order to end up with an optimized “directory infrastructure”. First of all, it is important to understand which directories you have and how they are used – usually there are far more directories out there than you’d expect. And I’m not only talking about Active Directory, eDirectory and all the LDAP servers, but also about “de facto” directories in the form of tables in databases and so on. I’m talking about anything which acts as a directory. That includes the application directories, which might be hundreds of small directories. They sometimes contain sensitive information like privacy-relevant data, and they frequently hold somewhat redundant data. Based on this analysis, you can drill down and identify which attributes have to flow between which directories in which use cases.

The latter is more about really optimizing your provisioning. The analysis, on the other hand, is also a good foundation for optimizing your directory infrastructure. Where can you avoid redundancy?
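
A minimal sketch of such an analysis could look like the following. It assumes a hand-maintained inventory of directories and the attributes each one stores; all names are made up for illustration.

from collections import defaultdict

# Hypothetical inventory: which directory stores which attributes.
directory_inventory = {
    "Active Directory": {"userid", "mail", "phone", "department"},
    "HR database":      {"userid", "department", "cost_center", "date_of_birth"},
    "CRM application":  {"userid", "mail", "phone"},
}

# Invert the inventory: for each attribute, where is it stored?
attribute_locations = defaultdict(set)
for directory, attributes in directory_inventory.items():
    for attribute in attributes:
        attribute_locations[attribute].add(directory)

# Attributes stored in more than one place are candidates for consolidation
# or for a clearly defined synchronization flow (e.g. via provisioning).
for attribute, locations in sorted(attribute_locations.items()):
    if len(locations) > 1:
        print(f"{attribute}: redundant in {sorted(locations)}")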

Based on such an overview, you can think about some other aspects:

  • Which central directories do you need for which use cases?
  • How to optimize application access to directories?
  • Where do you need specific technology for these directories beyond standard LDAP?

There is always a need for some more or less central directories. Active Directory or eDirectory are examples, used for the primary authentication of internal users and for many infrastructure services – but they can’t do everything. There are corporate directories for centralized access to corporate information. There are more technical meta directories as the “source of truth” for distributed information.

We also have to think about optimizing the application directories. One or a few centralized directories, together with Virtual Directory Services as offered for example by Radiant Logic, Oracle, and Symlabs, are an interesting option to build such a centralized yet flexible infrastructure, with the Virtual Directory Service as the interface layer.
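
To sketch what that interface layer means for an application, here is how a client could query such a virtual directory over standard LDAP, using the Python ldap3 library. The host, credentials, base DN and attribute names are placeholders, and the virtual directory product behind the LDAP endpoint is interchangeable.

# Minimal sketch: an application querying a virtual directory over standard LDAP.
# Host, credentials, base DN and attribute names are placeholders.
from ldap3 import Server, Connection, SUBTREE

server = Server("vds.example.internal", port=389)
conn = Connection(server,
                  user="cn=app-account,ou=service,dc=example,dc=com",
                  password="change-me",
                  auto_bind=True)

# The virtual directory decides behind the scenes which backend
# (Active Directory, HR database, application directory, ...) answers the query.
conn.search(
    search_base="ou=people,dc=example,dc=com",
    search_filter="(mail=jane.doe@example.com)",
    search_scope=SUBTREE,
    attributes=["cn", "mail", "telephoneNumber"],
)
for entry in conn.entries:
    print(entry.cn, entry.mail)

conn.unbind()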

And we have to look at specific use cases where we need specialized technology. There are some innovative vendors out there: UnboundID for highly scalable environments, a space where others like Oracle, Novell, Siemens, and so on are active as well; eNitiatives with ViewDS for strong querying capabilities and the ability to easily build “yellow pages” style interfaces to these directories.

My experience is that there is still a lot of need to think about directory services – and a lot of room for improvement in most IT environments. What is your view on that topic?


BAM brought to reality

02.07.2010 by Martin Kuppinger

Do you remember the term BAM? BAM is an acronym for Business Activity Monitoring. It was a hype topic in the early 2000s. And then we didn’t hear much about it anymore. Yes, there are several vendors out there, providing different types of solutions. And as always, there are several vendors who claim to be the leaders in the BAM category.

When BAM became a hot topic some 10 years ago, the implementations were nothing more than slightly advanced analytics. That was, at that point in time, far from my expectations, which were about intelligent, automated, real-time and ex-post analysis of relevant activities in business systems and the identification of critical changes which require intervention. For sure, automated reactions as well as alerting should be part of this.
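
To make that expectation a bit more tangible, here is a minimal sketch of an automated control with alerting. The payment events, the threshold and the alert output are made-up examples, not any vendor’s implementation.

# Minimal sketch of a BAM-style automated control:
# watch business activity, flag critical deviations, and alert.
from datetime import datetime

APPROVAL_LIMIT = 10_000  # payments above this amount need a second approval

payment_events = [
    {"id": "P-1001", "amount": 2_500,  "approvals": 1},
    {"id": "P-1002", "amount": 48_000, "approvals": 1},  # violates the control
    {"id": "P-1003", "amount": 15_000, "approvals": 2},
]

def check_payment_control(event: dict) -> str | None:
    """Return an alert message if the event violates the control, else None."""
    if event["amount"] > APPROVAL_LIMIT and event["approvals"] < 2:
        return (f"{datetime.now().isoformat(timespec='seconds')} ALERT: "
                f"payment {event['id']} of {event['amount']} lacks a second approval")
    return None

for event in payment_events:
    alert = check_payment_control(event)
    if alert:
        print(alert)  # in a real GRC system this would feed the alerting/workflow layer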

The term BAM came to my attention again when talking with MetricStream recently. MetricStream is one of the leading-edge vendors in the GRC market. They are one of the “Enterprise GRC” vendors (Business GRC would be the better term). But in contrast to many others, they allow for a tight integration with IT systems and IT controls. Based on that, they are able to use automated controls of virtually any type and map them into their system. That in fact allows them to integrate what I had expected from BAM years ago with a holistic GRC approach. By the way: MetricStream has a pretty high rank on my list of GRC vendors…

When looking at the BAM market, I have to admit that there has been evolution since the early years of BAM. There is much more automation than pure analytics, and there are several interesting solutions out there. However, MetricStream is somewhat unique in enabling this (without talking about BAM) in the context of Business GRC, and thus in allowing it to be added as a generic approach to what every organization has to do today: building a GRC infrastructure with manual and automated controls – where the automated controls should deliver what BAM has been promising.

I assume that several of you have a different opinion – so I’m looking forward to your comments.

