Smart Data: The better Big Data – using the Open API Economy concepts to better deal with your data

21.05.2012 by Martin Kuppinger

IT vendors these days are making a lot of noise about “Big Data”. That comes as no surprise, since Big Data allows selling masses of expensive hardware, software, and services. But does it really make that much sense for the customer?

The sales pitch for Big Data is that companies can do better business based on that approach. They can do better marketing by analyzing more data about their customers. They might provide better security services by analyzing more data. They might need it to deal with machine-generated data, for example in the connected vehicle.

However: better marketing is not about still failing to understand the intention of the customer, just on the basis of more data (I recently blogged about that). For marketing it is about changing the fundamental principles and accepting that there is no way to reliably predict the current intention of a customer from historical data.

For security, there is a clear case for analyzing more data and understanding the patterns which become visible when you look at data from your endpoint security solutions, your firewalls, and some log files together – and not just one portion of that data. However, even here I don’t think that we necessarily have to put all that data into one place.

And if we look at machine-generated data, like that of the connected car, there is no single application which needs access to all of it. There is a lot of data, and there are many applications which might require some of that data, but there is no need for the single Big Data store. The connected vehicle (and some other elements of the upcoming “smart world”) are topics I’ll cover in more depth in upcoming reports and blog posts.

I fully agree that the volume of data is increasing. But I doubt that we really need to deal with exponential growth beyond what is covered by the growth of computing capacity described by Moore’s Law.

There’s also this: building bigger data stores inevitably leads to compliance issues. One of the favored targets of today’s auditors is SAP Business Warehouse. A lot of data ends up in there and it is hard to understand and control with which other data it is combined and where it ultimately goes. So betting on Big Data may also mean provoking new audit findings.

My perspective is that we should focus much more on getting smarter in dealing with data. If we know what we want to do with data, then we can request the data we need from the existing data sources. We can even request pre-processed data to become smarter. A foundation for that is what my colleague Craig Burton recently described in his report on the Open API Economy.

The Open API Economy is about providing simple (or smart) APIs, like RESTful APIs, which are easy to use. If we add such APIs to our existing data sources, we can consume different APIs and extract only the data we really need for analytics. This means that we don’t need to build the big, fat data stores: we extract what we need, and the API might even provide pre-processed data. We could also import data if we need it more frequently.
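To make this concrete, here is a minimal sketch of that consumption pattern. The endpoint, the query parameters, and the response fields are all hypothetical; the point is simply that the consumer requests a small, pre-processed slice instead of replicating the raw data.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical RESTful endpoint of an existing data source. In the spirit of
# the Open API Economy, the source exposes pre-processed (aggregated) data,
# so the consumer never has to copy the raw records into its own big store.
BASE_URL = "https://api.example.com/v1/sales/summary"

def fetch_summary(region: str, month: str) -> dict:
    """Request only the pre-aggregated slice we actually need."""
    query = urlencode({"region": region, "month": month,
                       "metrics": "revenue,orders"})
    with urlopen(f"{BASE_URL}?{query}") as response:
        return json.load(response)

# The analytics side works on a few kilobytes of summary data instead of a
# multi-terabyte copy of the raw transactions.
summary = fetch_summary("EMEA", "2012-04")
print(summary["revenue"], summary["orders"])
```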

Yes, there are aspects which we have to keep in mind, like latency issues and the amount of data which has to be moved across the wire. But overall there are many situations where being smart will be a better approach than being big.

I don’t say that there is no need for somewhat bigger data stores in some cases. But I doubt that there is any value by itself in Big Data. It is just a marketing buzzword. And it’s time to start thinking about the real value of Big Data business cases and about smarter approaches. Smart Data, and picking up concepts like the Open API Economy, can help save money and get more results for less investment.

Programmatic access to data also allows an entity to consider “contextual” or “salient” data at the time of an event. This design lets a given entity ignore any incoming data that isn’t salient and would therefore preempt the need for “Big Data” stores. On the other hand, there may be instances where an entity can ignore an event at a given time, but needs to be able to go back in time to see what happened. The sketch below illustrates the idea.
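A minimal sketch of that pattern, with all names and the salience test purely illustrative: salient events are acted upon immediately, everything else goes into a bounded buffer that allows looking back without growing into a permanent Big Data store.

```python
from collections import deque

# Bounded buffer instead of an unbounded data store: we can still
# "go back in time" for a while, but nothing accumulates forever.
ARCHIVE = deque(maxlen=100_000)

def is_salient(event: dict, context: dict) -> bool:
    # Salience here simply means: does the event relate to what we care about now?
    return event.get("topic") in context.get("topics_of_interest", set())

def process(event: dict) -> None:
    print("acting on", event["topic"])

def handle(event: dict, context: dict) -> None:
    if is_salient(event, context):
        process(event)          # act on contextual data at the time of the event
    else:
        ARCHIVE.append(event)   # keep a limited history for later inspection

handle({"topic": "login-failure"}, {"topics_of_interest": {"login-failure"}})
```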

All this leads to a common theme throughout our KuppingerCole philosophy: “freedom of choice”. If the Big Data approach works for you, then by all means use it. If you don’t need it, ignore it. It’s not about whether Big Data is wrong. It is about the enterprise or individual having a choice and doing what is best for them. Be smart and think about Smart Data!


The Future of IT Organizations – why IT needs a marketing department

16.05.2012 by Martin Kuppinger

Some weeks ago we published a report called “The Future of IT Organizations”. This report talks about how to restructure IT organizations, following the basic structure we propose for IT in the KuppingerCole IT Paradigm. That paradigm was first described in the KuppingerCole Scenario “Understanding IT Service and Security Management”. From our perspective, IT organizations have to change fundamentally in order to redefine the way we do IT and to better deal with challenges like Cloud Computing.

When looking at the future of IT, there is one area which I find particularly interesting. Some of this came to my mind when reading one of the blog posts by Chuck Hollis, Global Marketing CTO of EMC Corporation. The blog post is titled “Why IT Groups will invest in Marketing” and focuses on the need for marketing.

What I liked in that post was the distinction between inbound and outbound marketing for IT – a distinction I picked up and for which I have to credit Chuck. I then aligned it with the KuppingerCole IT model, adding another element: “product management”.

The IT of the Future is demand-driven. Today’s IT should be as well but reality frequently shows a different picture. Providing the services business really needs is very much about that demand-driven IT. That requires understanding the customers. And that is where the topics of Outbound and Inbound Marketing come into play.

Outbound Marketing is the more common approach. We are all familiar with it from everyday life, when we are confronted with advertisements and other types of market communication from vendors. For IT organizations there are two main aspects of Outbound Marketing:

  • Positioning IT as the one and only source of the services business requires
  • Selling the IT services which are produced on-premise as part of these business services

The first part is of high importance because IT should remain in control (or get back control) of all the IT services which are either produced on-premise or procured from the Cloud. Without centralized control, organizations will, over time, struggle massively with their IT services. Furthermore, there is no way to get a grip on IT costs without such centralized control.

The other part of outbound marketing is mandatory as well. The ability to sell the services which are produced on-premise is important. On-premise IT is in competition with cloud services. Thus it is not only about producing the “better” IT services; it is also about selling them. IT Organizations have to change their attitude from being reactive to becoming a proactive provider of services to the business organization.

But there is the other side of the coin as well: Inbound Marketing. Inbound Marketing is even more about the customer’s needs – with the customer being the business part of your organization. Inbound Marketing is (amongst other things) about

  • Understanding the specific needs of your customer
  • Identifying the buyers on the customer side (which even in large organizations frequently is not as clear as it should be when it comes to budget discussions)
  • Understanding how the customer wants to consume services

It is about understanding the customer and driving the IT organization in a way that ensures the right services are offered. In fact this is about a strategic and standardized approach to providing exactly the services business needs.

From an organizational perspective, IT has to fundamentally change its interaction with business. It is about bringing the demand-supply principle to life, which has been discussed for quite a while. The need to do that is greater than ever.

What do IT organizations need at that level?

  • They need to identify the “customer’s customers”, i.e. the persons within the business organization who are requesting the business services. That might require changes in the business organization as well, given that the business needs contact points. Notably, these persons might be less technical than today, given that the ideal of the future IT organization is to provide business services the way business needs them.
  • They need, as mentioned earlier, IT Marketing, i.e. persons caring for the outbound as well as the inbound marketing.
  • They need “product managers”. If you look at large and successful vendors, product management always plays an important role: product managers are the link between the customer and software development, translating between customer requirements and development. Much the same role applies here. They work closely with IT Marketing and the customer’s customers on one side, and with Service Management within the IT Service & Security Management Layer on the other, to map demand to services.

Simply put: IT organizations, in their changing role as suppliers to the demand of business, should act like successful software organizations – with the difference that they don’t need the same level of sales, but rather the marketing and product management parts.


Intention and Attention – how Life Management Platforms can improve Marketing

15.05.2012 by Martin Kuppinger

Life Management Platforms will be among the biggest things in IT within the next ten years. They differ from “Personal Data Stores” in that they add what we call “apps” to the data stores and are able to work with different personal data stores. They thus allow everyone to work securely with personal data, using apps which consume but do not unveil that data – in contrast to a data store which merely provides or allows access to personal data. They are more active and will allow every one of us to deal with our personal data while enforcing privacy and security. As for “Personal Clouds”: those might be or become Life Management Platforms. However, I struggle with that term, given that it is used for so many different things, and thus prefer to avoid it. Both today’s personal data stores and personal clouds have a clear potential to evolve towards Life Management Platforms – let’s wait and see. I’ve recently written a report on Life Management Platforms, describing the basic concepts and looking at several aspects like business cases. This report is available for free.

The other big thing around this topic is the book “The Intention Economy” by Doc Searls. It is a must-read, and even though it mainly focuses on the relationship between vendors and customers, there is a big overlap between what Doc has written there and what we at KuppingerCole expect to happen with Life Management Platforms.

Doc’s basic point is that the Intention Economy will change the relationship between vendors and customers. I like these quotes:

“Relationships between customers and vendors will be voluntary and genuine, with loyalty anchored in mutual respect and concern, rather than coercion. So rather than ‘targeting’, ‘capturing’, ‘acquiring’, ‘managing’, ‘locking in’, and ‘owning’ customers, as if they were slaves or cattle, vendors will earn the respect of customers who are now free to bring far more to the market’s table than the old vendor-based systems ever contemplated, much less allowed.”

“Likewise, rather than guessing what might get the attention of customers – or what might ‘drive’ them like cattle – vendors will respond to the actual intention of customers. Once customers’ expressions of intent become abundant and clear, the range of economic interplay between supply and demand will widen, and its sum will increase. The result we will call the Intention Economy.”

“This new economy will outperform the Attention Economy that has shaped marketing and sales since the dawn of advertising.”

Yesterday I gave a presentation at an event organized by doubleSlash, a German consulting and software company focused on sales and marketing. The so-called “slashTalk” had the title “After the Social Media Bang” and focused on what companies will have to do now. There were several marketing executives and experts from different companies in the room.

Before my presentation on Life Management Platforms there was another presentation which I found extremely interesting. Björn Eichstädt, founder and managing partner at Storymaker, a company which originally started as a PR agency, talked about his view on attention and why today’s marketing fails in most cases. Björn has a degree in neurobiology, so he is far more than just a PR guy. He talked about “attention” and the small window of time within which you can catch someone’s attention. It can be caught, as today’s social networks demonstrate, but it isn’t easy. On the other hand, providing what fits the current focus of attention is much more promising than trying to change that attention, which is what traditional marketing does.

Taking this view, the one of Doc Searls, and the idea of Life Management Platforms the way we at KuppingerCole have it in mind shows where things become really interesting: a Life Management Platform allows you to express your Intention. The Intention is nothing other than a vital part of where your current Attention is focused. In other words: knowing the Intention means knowing at least an important part of the current Attention, which is much better than trying to change the Attention. Furthermore, Life Management Platforms could provide more information about the current Attention in real time, but in a controlled way – controlled by the individual. That allows even more targeted information and makes this concept extremely attractive for everybody – vendors and individuals alike.

Imagine a world in which you can allow others to provide you exactly that piece of information you are interested in. Let’s give an example:

Your profile on a social network might provide the information that you have just arrived at the airport in a specific city. Some vendors might track this information and send you welcome messages, pointing to their local assistance or other offerings. That could be done based on what today’s social networks provide. And it is nice if you receive only one message, or offers which really suit your needs. But if you receive 20 messages from companies which all detected that your attention might be there, it is just annoying.

In a Life Management Platform you can control whom to inform about such a “social” event. That can be specific companies or industries. They know that someone arrived at the airport and needs some specific information, about directions, the next ATM, or the next public WLAN hotspot – or whatever else. The system provides that information to you and you use the service. This obviously is the better approach.

You might ask how this differs from typing “MUC ATM map” or “IAD WIFI” into a search engine. The fundamental difference is that the Life Management Platform can express your intention once it has learned about it – and you might have the same intention every time you arrive at an airport. It acts for you and consumes your preferences, for example the mobile phone providers you have contracts with and prefer for roaming, or the banks you have accounts at, in order to find the ATMs with low or no fees. Entering all that information into a search engine is annoying, and sifting through the results yourself is annoying as well. So there is an obvious value even in that simple use case. And you certainly might not want to give all that information about your bank accounts away – you might want something (the app in the Life Management Platform) to act upon it without unveiling it. You might want minimal disclosure, as in the sketch below.
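Here is a minimal sketch of that minimal-disclosure idea, with all names and data purely illustrative: the app matches incoming offers against the preferences stored on the platform, and only the match leaves the platform – never the preferences themselves.

```python
# Illustrative sketch of a Life Management Platform "app": it consumes the
# user's stored preferences without unveiling them. All names are hypothetical.

PREFERENCES = {
    "banks": {"ExampleBank"},             # banks the user has accounts at
    "roaming_providers": {"ExampleTel"},  # preferred mobile providers for roaming
}

def filter_atm_offers(offers: list) -> list:
    """Keep only ATMs where the user pays no extra fee. The vendors never
    learn which banks the user actually has accounts with."""
    return [offer for offer in offers if offer["bank"] in PREFERENCES["banks"]]

incoming = [
    {"bank": "ExampleBank", "location": "Terminal 2"},
    {"bank": "OtherBank", "location": "Terminal 1"},
]
# Minimal disclosure: only the matching offer is surfaced to the user,
# and nothing about the preferences flows back to the vendors.
print(filter_atm_offers(incoming))
```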

Life Management Platforms will enable that, amongst many other things. They are a vehicle to fundamentally change the way marketing is done, moving from changing the attention to using attention and intention in a controlled and targeted way. Thus, everyone responsible for marketing should start looking at the ideas around Life Management Platforms, the Intention Economy, and Björn’s understanding of what Attention really is about. It is a simple way to get much better at marketing and save money.


Entitlement Management – has it really been an academic exercise?

10.05.2012 by Martin Kuppinger

Recently I read a blog post by my appreciated and well-known analyst colleague Kevin Kampman at Gartner Group, talking about entitlement management. That post had some points which made me wonder. I’ll pick some of the quotes:

  1. “One of access control’s biggest challenges is that it has often been an academic exercise. Maybe we can move the discussion forward by thinking about what is needed, not just what is possible.”
  2. “For any object, a set of conditions should be met to provide access such as time, attribute, role, etc. it seems we need a more flexible way to characterize all of the conditions that need to be met for access to be granted. Not attributes about the object itself but what you need to bring to the party to play.”
  3. “A lot of the focus in the *-BAC world is what attributes IT can provide to represent these conditions. It might make more sense to describe the conditions needed to characterize access.”

There are more, but these are some which I feel the need to comment on. Let’s start with the first one. I would agree that role management in its early days, when it first became mainstream, sometimes really was too much of an academic exercise. But if I look at the reality of projects today, that’s no longer the case. Role management is well understood and there is a lot of knowledge available on how to successfully implement role management in practice.

Turning to what dominates the evolution of Entitlement Management today, we have to look at Dynamic Authorization Management. Here, neither the evolution of XACML as the key standard nor that of claims as a related and somewhat overlapping approach is driven by theorists. Furthermore, most of the products in the Dynamic Authorization Management market, like those of CA Technologies, CrossIdeas, IBM, or Oracle, are derived from projects and the customer needs therein. They were built for practitioners from the very beginning. Even if they might not be perfect yet, they definitely are not the result of academic exercises. Consider also that Axiomatics, which started with a strong focus on XACML (and is one of the most active supporters of defining that standard), is strongly led by customer feedback and experience from real-world implementation projects.

My perspective is that the biggest challenge for Entitlement Management today is the organizational and process maturity of the customers when it comes to defining business roles and business rules, and when it concerns identifying the players in the business organization who have to participate. IT has become better at supporting IT/business alignment but still has some work to do there, especially around simple interfaces for defining business rules in Dynamic Authorization Management products and further improving the business interface of Access Governance tools. But this, again, is not the result of being too academic.

Regarding the second aspect: despite the criticism I have sometimes articulated regarding XACML as a standard which is too complex for end users (which I still believe is true), the underlying concept of implementing business rules is simple. Yes, it is annoying to write XACML, but that is true for any type of XML. Still, any business user can easily define rules in a structure that can be mapped to XACML – this is straightforward and simple to understand.

And within that concept (and other approaches to Dynamic Authorization Management) it is very simple to express the full variety of rules, from more technical ones to pure business rules using business-provided constraints or competencies. This is focused on objects – but the objects can be anything, from a piece of information (like a document) or its representation (like a share) to business activities within business processes. This is all there, so it is fairly simple to use. And the same concepts can be used for all types of use cases: you can rely on a subset of the same set of policies for versatile, context-based authentication and authorization (which again provides attributes for other decisions) and for the internal authorization in a business application which needs to enforce complex business rules, such as for the approval of new insurance contracts. The sketch below shows how simple such a rule can be.
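As a minimal illustration – not XACML syntax, and with all attribute names invented for the example – here is how a business rule like “insurance contracts above 100,000 EUR may only be approved by a senior underwriter” maps to an attribute-based structure that a XACML-style engine could evaluate:

```python
# Sketch of an attribute-based rule; the structure, not the syntax, is the point.
POLICY = {
    "resource": "insurance-contract",
    "action": "approve",
    "conditions": [
        # Business rule: small contracts need no special role; large ones
        # require the senior-underwriter role.
        lambda req: (req["contract_value"] <= 100_000
                     or "senior-underwriter" in req["subject_roles"]),
    ],
}

def decide(request: dict, policy: dict) -> str:
    """Evaluate a single policy against a request, XACML-style:
    the result is Permit or Deny."""
    applicable = (request["resource"] == policy["resource"]
                  and request["action"] == policy["action"])
    if applicable and all(cond(request) for cond in policy["conditions"]):
        return "Permit"
    return "Deny"

request = {"resource": "insurance-contract", "action": "approve",
           "contract_value": 250_000, "subject_roles": {"clerk"}}
print(decide(request, POLICY))  # -> Deny: a clerk cannot approve 250,000 EUR
```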

Having said this, we arrive at the third quote. Don’t we describe the conditions today? I’d say we can do it, and we frequently do it, not only within Dynamic Authorization Management but also in more advanced concepts around Access Governance. These concepts go beyond roles today and can use concepts of constraints or competencies. Some implementations are tightly coupled with business activities and business processes.

By the way: introducing a term like *-BAC doesn’t seem to provide much value to the customer. We have RBAC (which, in the NIST approach, is somewhat academic – but not in the real world). We have sometimes used the term ABAC (Attribute Based Access Control) in the industry, with “attribute” describing any attribute which can be used within policies, including roles as a specific type of attribute. So ABAC covers everything, and *-BAC only creates a babel of terms.

Simply put: my view on the state of Entitlement Management, Access Governance, and Dynamic Authorization Management is fundamentally different from the one in the blog post mentioned above. I think that the industry is much more mature. And not too academic.

 


Dynamic Authorization Management Best Practices

09.05.2012 by Martin Kuppinger

Due to a last-minute speaker change I had to prepare a short presentation on “Dynamic Authorization Management – Best Practices from our Advisory” for EIC 2012. When we found a replacement for the speaker, I ended up not giving that presentation. However, I will do a webinar on the topic soon, and I want to provide some of the content here as a sort of appetizer.

Dynamic Authorization Management is about dynamically deciding whether to approve authorization requests coming from services (like applications), based on policies and attributes (roles, the application used, context, and so on). It includes policy definition and management, access to the sources of these attributes such as directory servers, databases, and ERP systems, and systems for context- and risk-based authentication and authorization. A key standard is XACML. The role of Dynamic Authorization Management within overall IAM (Identity and Access Management) is defined in the KuppingerCole Scenario “Understanding Identity and Access Management”.

A key success factor in Dynamic Authorization Management is to bring participants from all the different silos involved to the table. You need people from the business organization, application architects and developers, IT Security, and others. This is a complex challenge.

Another key success factor is to set the right scope and to start small enough to be successful. The design has to cover coarse-grained and fine-grained authorization. It has to look at all types of applications and users. And, thinking about the “Identity Explosion”, that means it has to cover authorization not only for employees, but for many other types of users.

When planning the environment, the positioning of the Policy Enforcement Point (PEP) and the Policy Decision Point (PDP) (more information on XACML, PEPs, and PDPs here) is one of the challenges. Vendors provide a lot of flexibility – and you need to understand the different options to meet the performance and scalability requirements of your environment. This becomes increasingly complicated in cloud environments, given that it is hard to run a large number of queries across long distances in an efficient way, so approaches like statically provisioning access controls to systems might come into play. Clearly, putting a lot of thought into the concepts is a key success factor, especially given that Dynamic Authorization Management has to cover more or less all of your distributed environment.
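To illustrate the PEP/PDP split discussed above, here is a minimal sketch with purely illustrative names and policies; in a real deployment the PEP-to-PDP call is typically a network round trip, which is exactly why placement and latency matter:

```python
class PolicyDecisionPoint:
    """Evaluates requests against policies; in practice these would be
    loaded from a policy store, e.g. written in XACML."""
    def __init__(self, policies):
        self.policies = policies  # callables: (subject, resource, action) -> bool

    def evaluate(self, subject: dict, resource: str, action: str) -> str:
        for policy in self.policies:
            if policy(subject, resource, action):
                return "Permit"
        return "Deny"

class PolicyEnforcementPoint:
    """Sits in front of the protected service and enforces PDP decisions.
    If the PDP runs remotely, every request here is a network round trip."""
    def __init__(self, pdp: PolicyDecisionPoint):
        self.pdp = pdp

    def request(self, subject: dict, resource: str, action: str) -> str:
        if self.pdp.evaluate(subject, resource, action) != "Permit":
            raise PermissionError(f"{action} on {resource} denied")
        return f"{action} on {resource} granted"

pdp = PolicyDecisionPoint([lambda s, r, a: "auditor" in s["roles"] and a == "read"])
pep = PolicyEnforcementPoint(pdp)
print(pep.request({"roles": {"auditor"}}, "audit-log", "read"))
```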

Acceptance by developers is directly related to simplicity, so keeping things simple for developers is also one of the key success factors. You should start thinking about applying the paradigms of the Open API Economy here – see the sketch below.
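In that spirit, the whole decision machinery can be hidden behind a single simple call, so that application developers never touch XACML directly. The endpoint URL and payload format below are hypothetical:

```python
import json
from urllib.request import Request, urlopen

def is_authorized(user: str, action: str, resource: str) -> bool:
    """One simple question, one boolean answer; everything else
    (policies, attributes, XACML) stays behind the API."""
    payload = json.dumps({"user": user, "action": action, "resource": resource})
    req = Request("https://pdp.example.com/v1/decision",  # hypothetical endpoint
                  data=payload.encode("utf-8"),
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as response:
        return json.load(response).get("decision") == "Permit"

# In application code, the check then is a single line:
# if is_authorized("alice", "approve", "contract-4711"): ...
```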

The same is true for policy definition. The good thing is that the way policies are described in XACML from a conceptual perspective (so without the XML stuff around) is pretty straightforward, simple to understand, and powerful. Nevertheless you have to educate your business users in expressing their business policies and translate this for the IT level. And you shouldn’t underestimate the complexity of auditing and analyzing policies in a dynamic environment.

However, when putting sufficient work into the concepts, you can design a Dynamic Authorization Management environment today which is future-proof. You should also do it because that will help you to become much more efficient in the management of Information Security and much more agile in fulfilling today’s and tomorrow’s audit requirements.


Bring Your Own Identity? Yes. And No.

08.05.2012 by Martin Kuppinger

Recently I read a blog post by Nick Crown, Director of Product Marketing at UnboundID. He talked about “Bring Your Own Identity”, which he thinks is more groundbreaking and disruptive than BYOD (Bring Your Own Device). I would say yes, there is value in BYOI, but:

  • this is definitely not as groundbreaking and disruptive as BYOD
  • this is only a small piece in a much larger puzzle, and it definitely will not end with a two-tiered identity infrastructure as proposed in Nick Crown’s blog post
  • there’s definitely no need to introduce yet another marketing buzzword and acronym like BYOI

Certainly, just like every other vendor blog post, the one by Nick Crown is driven by the wish to position the company as “the primary vendor” in a specific area. But the question from a customer perspective (and from an analyst perspective) is: does it really make sense?

So I want to focus on the three points above:

BYOD is one of the trends which are fundamentally changing the way we need to do IT, from the systems management perspective as well as from the information security perspective. It is about moving away from device-centric security to information-centric security approaches. That is a massive change, much bigger than any around identities. BYOD is directly related to the big changes we commonly call Mobile Computing and the Consumerization of IT, and it also relates to the “Deperimeterization of IT”. BYOI (when defined as the user bringing his or her own identity) is, of course, related to big trends such as Social Computing. But it isn’t as new as some people claim. Federation, as one approach to deal with this, has been around for quite a while and is still evolving – look at OpenID Connect, recently awarded a European Identity Award by KuppingerCole for being the best new standard.

BYOI is much smaller than BYOD in its impact because of the second point mentioned above, something we at KuppingerCole have been talking and writing about for a pretty long time now. The reality is that there will be multiple identity providers. This is about things like trust frameworks, about concepts like claims, and about the need to become flexible enough in the days of the Identity Explosion. It is about gaining the ability to deal with multiple pieces of information provided by different providers, instead of one provider or two tiers of providers. There will be many different types of identity providers – in fact, they are already here. What changes is the ability to deal with these providers. That is about federation, about claims, and about concepts like IDMAAS (Identity Management as a Service) the way Kim Cameron presented it in his keynote at EIC 2012. However, it is not that much about directory services or technical synchronization. The fact that someone brings his own identity is just a little piece. More important than accepting a BYOI ID is the ability to accept many different providers and to convert their identities into other IDs once the type of transaction and interaction with the individual requires such a conversion – the sketch below illustrates that conversion step.
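A minimal sketch of that conversion step, with provider names and claim keys invented for the example: externally issued claims from different identity providers are normalized into one internal representation, which is the capability that actually matters.

```python
# Map each provider's claim vocabulary to the internal attribute names.
# Providers and keys are illustrative, not real APIs.
CLAIM_MAPPINGS = {
    "facebook":       {"name": "name", "mail": "email"},
    "corporate-idp":  {"name": "cn",   "mail": "mail"},
    "openid-connect": {"name": "name", "mail": "email"},
}

def to_internal_identity(provider: str, claims: dict) -> dict:
    """Normalize externally issued claims into the internal representation,
    regardless of which identity provider the user 'brought'."""
    mapping = CLAIM_MAPPINGS[provider]
    return {internal: claims[external] for internal, external in mapping.items()}

print(to_internal_identity("corporate-idp", {"cn": "Alice", "mail": "a@example.com"}))
# -> {'name': 'Alice', 'mail': 'a@example.com'}
```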

I’d also recommend you have a look at our report “Life Management Platforms”, which is available for free. This report explains a concept which will fundamentally influence the way we deal with “own identities”, which then really could be something you’d like to call BYOI, even while it is not only about bringing but also about controlling.

So even with Life Management Platforms, there is no need for the BYOI buzzword. It is not mainly about bringing your own identity (and, by the way, a Facebook ID is anything but an “own identity” when you look at the Facebook terms and conditions), but about enabling the flexible use of different identities. So BYOI is far too narrow to describe the changes we see these days, and we really should avoid using that buzzword and focus on what is really changing around identities.

