Dynamic Authorization Management and ABAC: The journey is the reward

30.05.2014 by Martin Kuppinger

The Chinese philosopher Confucius is said to be the originator of the saying “the journey is the reward”. What does it mean? In its historic meaning, it says that people benefit by moving forward, even if they never reach perfection. Applied to projects, it means that continuous improvement, new understanding, and small successes over time are the reward – not the ideal end-state.

In IT, a project might never reach its desired end-state, at least not at enterprise scale. One example is what is commonly referred to as Dynamic Authorization Management (as a discipline) or ABAC – Attribute-based Access Control – (as a theoretical concept). Organizations might succeed in a particular project on Dynamic Authorization Management, but they will rarely manage to transform their entire Identity and Access Management in such a way that every single authorization decision is made dynamically, using a central authorization system and relying on one or more attributes (i.e. attribute-based).

There is no doubt that Dynamic Authorization Management is the better way of authorizing access to information and systems, compared to statically assigned entitlements at the system level or the lack of a valid, fine-grained authorization concept. Relying on centrally managed policies provides many benefits: consistency of authorization policies, always up-to-date policies, and reduced administrative effort, to name just a few. Another important point is that Dynamic Authorization Management allows authorization decisions to be made in the context of the user, when integrated with versatile, risk- and context-based authentication.

While the discussion about RBAC (Role-based Access Control) versus ABAC (Attribute-based Access Control) is somewhat artificial and theoretical, moving towards Dynamic Authorization Management is a must for mature IAM/IAG infrastructures. There are too many advantages. Notably, Dynamic Authorization Management is not new. Some of today’s products came to market back in the 1990s. In mainframe infrastructures, Dynamic Authorization Management even dates back to the 1970s.

However, there are four challenges:

  • Existing applications
  • Software architects and developers
  • Providers of Commercial off-the-shelf (COTS) software
  • Cloud Service Providers (CSPs) and standards bodies

Most existing applications do not support the externalization of authorization decisions to a Dynamic Authorization Management system. Changing such applications is at best expensive and cumbersome; for many applications it is simply impossible.

Software architects and developers might be hard to convince to change the way they implement security (or what they believe is security). Even though IAM/IAG and software development are commonly separate silos in IT organizations, this is the challenge that is easiest to solve. Explain the need and provide simple interfaces to the Dynamic Authorization Management system that make the developer’s life easier, not more complex, and you will succeed.
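To illustrate how simple such an interface can be, here is a minimal sketch in Python of an application acting as a Policy Enforcement Point (PEP) that externalizes the decision to a central Policy Decision Point (PDP). The endpoint, attribute names, and response format are hypothetical – not taken from any specific product or from the XACML specification:

```python
# Minimal sketch of externalized authorization: the application (PEP) sends
# an attribute-based request to a central PDP instead of hard-coding checks.
# The endpoint and the request/response format are hypothetical.
import requests  # assumes the 'requests' package is installed

PDP_URL = "https://pdp.example.com/authorize"  # hypothetical PDP endpoint

def is_permitted(subject: dict, action: str, resource: dict) -> bool:
    """Ask the central PDP for a decision based on attributes."""
    decision_request = {
        "subject":  subject,            # e.g. id, department, risk score
        "action":   {"id": action},     # e.g. "read", "approve"
        "resource": resource,           # e.g. type, classification, amount
    }
    response = requests.post(PDP_URL, json=decision_request, timeout=5)
    response.raise_for_status()
    return response.json().get("decision") == "Permit"

# In application code, the check is one line instead of embedded logic:
if is_permitted({"id": "alice", "department": "finance"},
                "approve", {"type": "invoice", "amount": 50000}):
    print("approval allowed")
```

The point of the sketch: when policies change, only the central PDP configuration changes, while the application code stays untouched.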

For providers of COTS software, things are more difficult. They rarely support standards such as XACML (eXtensible Access Control Markup Language) to interface with Dynamic Authorization Management systems. Even if you have a well-working gate between procurement and Information Security, that does not help unless the COTS software provides the required interfaces.

Things become even worse with the Cloud. There is just no adequate authorization standard for the Cloud yet. Given the fact that a very significant portion of Cloud services still lacks support for basic standards such as SAML (Security Assertion Markup Language), this is no surprise. This will change, but it will take a while.

There are some workarounds such as applying Dynamic Authorization Management at the level of XML Gateways, API Gateways, or Web Access Management solutions. However, there will remain many applications which just can’t be moved to Dynamic Authorization Management within a foreseeable period of time.

Despite these challenges, Dynamic Authorization Management is a must for every organization maturing its IAM/IAG infrastructure and improving Information Security. Thus it is high time to evaluate these concepts and start using them.

But even then, Dynamic Authorization Management must be considered a long journey, where every single application on-boarded is a reward.


Posted in Dynamic Authorization Management | Comments Off

How to identify attacks? Know your enemies – and what they already might do.

26.05.2014 by Martin Kuppinger

In a panel discussion I had at EIC 2014 with Roy Adar, Vice President of Product Management at CyberArk, Roy brought up an interesting number: according to research, attacks start on average 200 days before they are detected. Taking into account the spread of the distribution behind this average, some attackers might have been active for years before they were detected. And who knows whether all of them are ever detected at all.

How to react to this? There are several elements to the answer. Protect your systems with various layers of security. Use anti-malware tools, even though they won’t catch every piece of malware and every attacker. Encrypt your sensitive information. Educate your employees. These and other “standard” measures are quite common. But there is at least one other thing you should do: analyze the behavior of users in your network.

I do not mean user tracking in the sense of “are they doing their job” (which is hard to implement in countries with strong works councils); I’m talking about identifying anomalies in their behavior. Attackers are characterized by uncommon behavior. Users might access far more documents than average, or than they did before. Accounts might be used at unusual times. Users might log in from suspicious locations. Sometimes it is not a single incident but a combination of things, possibly over a longer period of time, that is typical for a specific form of attack – especially in the case of long-running APTs (Advanced Persistent Threats).

There is an increasing number of technologies available for analyzing such patterns. Standard SIEM (Security Information and Event Management) tools are one approach; however, analysis of anomalies can be difficult to perform based on rules alone. An increasing number of solutions therefore rely on more advanced pattern-matching technologies. These can, based on specific mathematical algorithms, turn log events and other information into patterns (in fact, complex matrices) and analyze them for anomalies. There might be some noise in the form of false positives in the results, but that is true for rule-based analytics as well. Combining such analytical technologies can make a lot of sense: if you bring together specialized analytics for areas such as Privilege Management (for instance, CyberArk’s PTA), User Activity Monitoring, pattern-based analytics, and traditional SIEM, you might learn a lot about these anomalies and hence about the attacks that are already running and the attackers behind them.
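The core idea of baselining a user’s behavior and flagging deviations can be shown in a few lines. The following Python sketch is purely illustrative – toy data and a simple z-score, not any vendor’s algorithm:

```python
# Minimal sketch of rule-free anomaly scoring on log data: baseline each
# user's own behavior, then flag activity that deviates strongly from it.
# The data and the threshold are illustrative assumptions.
from statistics import mean, stdev

# documents accessed per day, per user, from parsed logs (toy data)
history = {
    "alice": [12, 15, 11, 14, 13, 12, 16],
    "bob":   [3, 4, 2, 5, 3, 4, 3],
}

def anomaly_score(user: str, todays_count: int) -> float:
    """z-score of today's activity against the user's own baseline."""
    baseline = history[user]
    sigma = stdev(baseline) or 1.0  # guard against zero variance
    return (todays_count - mean(baseline)) / sigma

for user, today in [("alice", 14), ("bob", 45)]:
    score = anomaly_score(user, today)
    print(f"{user}: score={score:.1f} -> {'ANOMALY' if score > 3 else 'ok'}")
```

Real products combine many such signals (times, locations, access volumes) into far more complex models, but the principle – compare against a learned baseline rather than a hand-written rule – is the same.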

From our perspective, all this is converging into a new discipline we call Real-Time Security Intelligence (RSI). There is a new report out on that topic. I also recently wrote another post on RSI.

Even if you feel it is too early to move towards RSI, you should put your focus on how to learn more about the attackers that are already inside your network. Understanding anomalies and patterns with new types of analytical technologies might help.


Posted in APT, Information Security | Comments Off

The Future of Corporate IT

06.05.2014 by Martin Kuppinger

When looking at today’s IT, it is driven by some major evolutions, and everything done in IT has to take them into account. One is Social Computing; the second is Mobile Computing; the third is Cloud Computing. All these trends affect IT fundamentally. The consumerization and deperimeterization of IT are logical consequences. Information technology (IT) is available to virtually everyone and virtually everywhere.

When looking at the Future of IT Organizations, Cloud Computing has the biggest impact. With the rise of Cloud Computing, IT managers and Business started to feel that internal, on-premise IT needs to be able to compete against attractive external offerings. The IT Supply Chain is changing fundamentally: there are far more suppliers within reach. This evolution is neither new nor surprising – IT overall is moving from manufacturing to industrialization. For IT Organizations, that means they either adapt to this new age of “industrialized IT” or they will fail.

[Diagram: The Future IT Paradigm by KuppingerCole]

The Future IT Paradigm by KuppingerCole, a standardized model for building your future IT, provides the guideline for organizations to move their IT Organization and IT Infrastructure to the next level and to make it future-proof. It helps in fulfilling the major business requirements:

  • Provide the services that business really needs – agile, just-in-time, cost-effective, and in the way business really needs them
  • Enforce Information Security and protect the sensitive business information and intellectual property of the organization
  • Mitigate your IT risks, stay compliant, and enforce an enterprise-wide Governance approach

Looking at the Future IT Paradigm by KuppingerCole, it becomes obvious that the key to the new IT Organization is the segmentation of IT according to the layers defined in this model. But there is much more: it is also about creating new roles and responsibilities in the IT Organization. It helps IT Organizations regain leadership and make their on-premise IT production state-of-the-art again. The Future IT Paradigm by KuppingerCole consists of three layers – plus the Governance infrastructure and IT & Security Management.

Business Service Delivery focuses on providing exactly the services business needs, in the way business needs them, and on time. It is all about interfacing Business and IT. This is where Business/IT alignment moves from a buzzword towards reality.

Service & Information Management is what we also could name “Core IT”. This is where services are managed and where IT services are transformed into business services. This is also where IT Security is enforced. And it is the level where Information is managed.

IT Service Production is about producing services and providing them to the business. This layer supports all types of production environments, from on-premise to any type of cloud. These production units have to provide services in a standardized way. Ideally, they are themselves organized according to this three-layered structure, treating the output they provide as business services for their customer – in this case, IT itself.

For a full view of that model, and an in-depth description of what it means for the IT Organization and which structure, departments, and skills are required, have a look at the KuppingerCole Report #71,200, The Future of IT Organizations.


Real world face recognition and where paper beats the smartphone

01.04.2014 by Martin Kuppinger

A few days ago, I was travelling in a local train, together with a business partner, from my office in Germany to an event in another city. We both learned a lot about the real-world challenges of face recognition.

While I already had a 24-hour ticket for travelling in and around that city, my business partner needed to extend his. He used his smartphone and the railway company’s app to do so. So far, so good.

A few minutes later, a conductor arrived. Verifying my printed ticket was a matter of seconds; verifying the online ticket turned out to be far more complex. First, the conductor needed to scan the QR code of the online ticket displayed on my business partner’s smartphone. He did so using his own smartphone. It did not work at the original size, so he asked my business partner to enlarge the QR code. Eventually that worked.

However, a second factor was needed, so to speak, to ensure that this really was a personal ticket belonging to my business partner. The conductor’s app provided the name of the ticket holder plus the detail that he was using a discounted pass, valid for one year. My business partner showed the annual pass, with its number and a photo of himself printed on the front. It turned out that this was not sufficient – the face recognition simply failed.

My business partner had to take the discount card out of his wallet and display the back side with his name printed on it; only then was the ticket finally validated.

Overall, this took more than a minute. Face plus the number of the discount card plus possession of a smartphone with the valid ticket was not sufficient. In sum, this was cumbersome, inefficient, and costly for the railway company. Imagine what it costs when you need approximately ten times as long to verify tickets: either you check fewer tickets or you need more conductors. Both cost money – either lost revenue from more people travelling without valid tickets, or higher expenses for employing more conductors.
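A quick back-of-the-envelope calculation makes the effect tangible. The numbers below are assumptions for illustration, not measurements from the railway company:

```python
# Back-of-the-envelope illustration with assumed numbers: if an online-ticket
# check takes ten times as long as a paper check, checking capacity per
# conductor drops by the same factor.
paper_check_s = 6                     # assumed seconds per paper ticket
online_check_s = 10 * paper_check_s   # "ten times as long"
checking_time_s = 2 * 60 * 60         # assume 2 hours of checking per shift

print(checking_time_s // paper_check_s, "paper tickets per shift")    # 1200
print(checking_time_s // online_check_s, "online tickets per shift")  # 120
```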

While the face recognition issue was new to me (but amusing for the two identity people travelling), the other aspect is well worth considering, because it appears to be a common challenge. I have observed this in other countries as well: verifying online tickets takes far longer than verifying paper tickets. Maybe it is sometimes worthwhile to look at the real costs before the “modern” (but less than perfect) online solution is put into place. Not that I am against online tickets – but I would definitely prefer more efficient solutions. Another post on this topic is here.


Posted in Strong authentication | Comments Off

Real-time Security Intelligence – more than just “next generation SIEM”

14.03.2014 by Martin Kuppinger

Recently, a spotlight has been shed on the need for investing in Information Security solutions. The increase in cyber-attacks, the consistently high level of internal challenges, the appearance of more sophisticated types of long-running attacks (sometimes called Advanced Persistent Threats, or APTs), the concerns regarding cyber-security following the Snowden revelations, the permanent challenge of dealing with zero-day attacks that leave no time between becoming public and being exploited: all this has led to an understanding of the need for better solutions.

Organizations have to assume that the attacker is already in their network. Every organization and every user is a potential target for attackers. On the other hand, with the increasing sophistication of attacks, it is becoming more difficult to identify the attackers. Finally, there is no such thing as a single perimeter anymore where organizations can place their security systems to prevent external attackers from entering the network. Attackers might already have found their way in via mobile devices, they might attack cloud services, etc. Complexity is increasing.

We see a new category of solutions evolving in the market that promise to help customers better solve these challenges. First, though, let’s look at current solutions which are not sufficient.

Standard IDS/IPS (Intrusion Detection/Prevention Systems), in their concept as edge devices, are obviously limited when there is no well-defined perimeter. They are also limited when it comes to complex attack scenarios involving a number of systems.

SIEM (Security Information and Event Management) is still, typically, a tool-driven approach that requires heavy customization. Unless you are able to configure these systems correctly, they will not deliver on your expectations, for example in the setup of an SOC (Security Operations Centre). When it comes to taking more and more real-time information into account for analysis, they might show limitations regarding scalability.

Next Generation Firewalls are again edge devices, suffering from the conceptual limitations of such devices.

Services providing real-time security information – regarding newly detected zero-day attacks, for instance – deliver valuable information, but they don’t fix the problem. Furthermore, they do not provide analysis of what is happening in the internal infrastructure.

Recently, though, we have observed a growing number of vendors moving towards integrated methods for Real-time Security Intelligence, combining various technologies and services:

  • Big Data analytics, enabling the analysis of large amounts of data, based on both rules and patterns (a minimal sketch of this combination follows after this list);
  • Support for both real-time analytics and historical analysis, which can facilitate identifying new events as being related to those that occurred sometime in the past;
  • Integration to existing sources of information, including SIEM tools;
  • Integration with real-time security information services that provide up-to-date information about newly detected security challenges;
  • Services that provide automatically updated rules and patterns for analytics, i.e. configurations that reduce the need for customers to manually keep the Real-time Security Intelligence system up-to-date;
  • Services that support customers with analytics, i.e. expert services supporting the customer’s SOC;
  • Integration with IT GRC solutions, translating the identified challenges into risk information visible in dashboards for IT and business people.
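As announced in the first item above, here is a minimal Python sketch of what combining rule-based and pattern-based analytics on an event stream means in principle. The event fields, rules, and profiles are illustrative assumptions, not taken from any SIEM or RSI product:

```python
# Minimal sketch: combine explicit rules with a learned per-user profile.
# All field names, rules, and profile data are illustrative assumptions.
from collections import Counter

events = [
    {"user": "svc_backup", "action": "login", "hour": 3, "src": "10.0.0.8"},
    {"user": "alice", "action": "login", "hour": 9, "src": "10.0.1.4"},
    {"user": "alice", "action": "login", "hour": 2, "src": "198.51.100.7"},
]

def rule_hits(event):
    """Rule-based analytics: explicit, human-written conditions."""
    rules = {
        "off-hours login": event["action"] == "login" and event["hour"] < 6,
        "external source": not event["src"].startswith("10."),
    }
    return [name for name, hit in rules.items() if hit]

# Pattern-based analytics: compare against each user's historical profile
profile = {"alice": Counter({"10.0.1.4": 250}),
           "svc_backup": Counter({"10.0.0.8": 90})}

def new_source(event):
    """Flag sources never seen before for this user."""
    return event["src"] not in profile.get(event["user"], Counter())

for e in events:
    findings = rule_hits(e) + (["new source for user"] if new_source(e) else [])
    if findings:
        print(e["user"], "->", findings)
```

Real solutions replace the toy profile with large-scale, continuously updated models over historical data; the value lies in correlating both kinds of findings, in real time and against history.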

Real-time Security Intelligence will become a mix of services and software. It will combine various offerings that exist today but are separate from each other. It will give customers better insight into what has already happened in their networks and what is currently going on. Some vendors even provide the capability of changing network configurations based on their analytical services.

We expect to see rapid evolution in this area, with further services to be added. A strong potential is in integrating network configuration management systems with Real-time Security Intelligence, allowing firewall settings, for example, to be changed on the fly. Another example is integration with SDCI (Software Defined Computing Infrastructures) to adapt the configuration of networks, storage, and virtual machines when new security challenges are identified, to automatically and dynamically minimize the attack surface.

In the evolution towards Real-time Security Intelligence that we observe as of now, some vendors focus more on Big Data security analytics while others put more emphasis on online services – but this is just scratching the surface. There will be fundamental changes in the way we do security and run SOCs, going well beyond just being “Next Generation SIEM”.

Learn more about Real-time Security Intelligence and how to successfully deal with your cyber-security challenges at the upcoming EIC 2014. And don’t miss our upcoming webinar on “Mitigate targeted attacks with privileged account analytics” – not primarily about Real-time Security Intelligence, but about one approach to mitigating the risk of becoming a victim of targeted attacks.


Posted in Information Security | Comments Off

The end of the Social Login begins: FIDO Alliance, Samsung, and PayPal to redefine authentication

06.03.2014 by Martin Kuppinger

Recently, the FIDO Alliance announced that PayPal and Samsung were enabling consumer payments with fingerprint authentication on the new Samsung Galaxy S5. My valued colleague Dave Kearns and I have written various posts about the FIDO Alliance and the impact we expect it will have on the market for strong authentication and BYOI (Bring Your Own Identity). Have a look here, here, and here.

What at first reads like one of the unexciting press releases I receive in huge quantities daily is in fact a groundbreaking paradigm shift that will have massive impact on device vendors, strong authentication technology providers, and – last but not least – social networks.

FIDO is all about enabling users to rely on one personal digital identity – their “own identity” in BYOI – to access various services. Not only that, it is also about enabling BYOI with strong authentication and, finally, getting rid of username and password authentication. While the Samsung/PayPal case is the first large use case for the FIDO Alliance, this is just the beginning. Looking at the long list of FIDO Alliance members, others will follow. Users can then access various services, relying on strong authentication and a locally managed identity on their smartphone. In addition, Samsung will not remain the only device vendor delivering FIDO-enabled devices.
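The underlying principle is simple public-key cryptography: the private key never leaves the user’s device, where it is unlocked locally (for instance by a fingerprint), and the relying party only stores a public key. The Python sketch below illustrates that concept only – real FIDO protocols add attestation, counters, and channel binding:

```python
# Conceptual sketch of FIDO-style authentication (not the actual protocol):
# the device signs a fresh challenge; the relying party verifies it with the
# public key registered during enrolment. Requires the 'cryptography' package.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# On the device: key pair generated locally, unlocked e.g. by fingerprint
device_key = Ed25519PrivateKey.generate()
registered_public_key = device_key.public_key()  # sent to the service once

# At the relying party: issue a fresh random challenge for each login
challenge = os.urandom(32)

# On the device: sign the challenge after local user verification
signature = device_key.sign(challenge)

# At the relying party: verify with the registered public key
try:
    registered_public_key.verify(signature, challenge)
    print("authenticated")
except InvalidSignature:
    print("rejected")
```

No shared secret ever travels to or is stored at the service – which is exactly why this model can replace both passwords and, potentially, the social login.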

Obviously, that will affect many markets: strong authentication vendors, device vendors, services acting as Identity Providers, etc.

It will especially have a massive impact on social networks. A significant part of their attractiveness is that many of them have become Identity Providers, supporting the “social login”. This is part of the business model of social networks: users are bound to the networks, and the social networks learn about user behavior, which is at the core of their business model. However, there is a downside to that from a marketing perspective, as I recently explained. Aside from that, social logins commonly lack support for strong authentication.

If the FIDO Alliance’s success continues, the need for social logins – the most common form of BYOI – will disappear. Why should users rely on social logins when they have a more secure way to authenticate, built into their devices of choice? With the beginning of the end of social logins, an important part of the business model of social networks starts to crumble away. And that is the really big news behind the recent announcement of the FIDO Alliance.


Posted in Social networks, Strong authentication | Comments Off

The Mt. Gox Bitcoin disaster and the need for innovation in the finance industry

05.03.2014 by Martin Kuppinger

A few days ago, Tokyo-based Bitcoin exchange Mt. Gox appeared to be in trouble. When looking at their website on Friday morning, I only found meaningless announcements: they were “working very hard to find a solution to our recent issues”. Looking at the situation realistically, chances are high that the owners of the Bitcoins have lost a significant part, if not all, of their money. Just a few hours later, the news spread that Mt. Gox had gone bankrupt. While it is still unclear what exactly happened and what will happen now with the Bitcoins and Mt. Gox, this sheds light on the concept of Bitcoin. Bitcoins were claimed to be absolutely safe. However, when you cannot use them but instead lose your “money”, this obviously is not the case.

There are good reasons for having trusted parties in the Finance Industry. Despite all the turmoil that industry has gone through in recent years – and back in the Great Recession and at other times – the concepts have worked relatively well.

On the other hand, the initial success of the Bitcoin currency also demonstrated that there might be a need for other concepts, aside from traditional currencies and the way financial transactions are handled. Even if the concept of Bitcoin turns out to be the wrong answer, that discussion will continue. Aside from requiring a trustworthy provider and exchange infrastructure, there are other questions to answer. One is security, with an increasing number of attacks. Overall, there is a strong trend towards crypto-currencies. We will see a lot of evolution, and we most likely will see failures and disasters, but it is not likely that crypto-currencies will disappear again.

It will be interesting to observe how the Finance Industry reacts to that pressure. While Bitcoin is the most prominent topic these days, PayPal and other new players in the mobile and online payment market are probably the bigger challenge to the Finance Industry. PayPal in fact is a new type of Financial Institution: a bank that knows how to provide APIs and how to interact with other players. It knows how to support supply chains. It knows how to find the balance between security and customer convenience.

On the other hand, Financial Institutions are still trusted when it comes to money, and they know how to do security. The challenge is how to make the banking business fit for the changing landscape of the Computing Troika (Cloud Computing, Mobile Computing, and Social Computing) and enable banks to provide their proven services for a world of consumerized IT. It is about API-enabling that industry.

However, that is more of a technical perspective. In fact, it is about moving banking IT to a level that allows Financial Institutions to leverage their strengths while becoming agile enough to compete with new players in the market. There is strong potential for trusted Financial Institutions to do so. However, that requires banks to look closely at API Management and Security, BYOI (Bring Your Own Identity), and trust and privacy concepts such as Life Management Platforms.

EIC 2014 will dive into this topic in the Finance Industry Roundtable on the “Future Model of Banking”. Discuss and learn how to enable business agility by doing things right in IT.

I personally believe that classical Financial Institutions have strong potential in the future finance business, despite Bitcoin and other concepts. I also believe in regulation. There is a good reason for regulations in the Finance Industry; having such regulations in place might have avoided the situation Mt. Gox and Bitcoin are in today. That is where the established Finance Industry comes into play: making crypto-currency more secure by providing professional services that comply with the regulations. Regulation will come to that field – and then the Finance Industry has an advantage again, if it is agile enough to support these new models by then.

Clearly, you might argue that the main value of crypto-currency is not having a regulated and safe method of payment, but one that is not traceable. It was Silk Road that brought Bitcoin to prominence. The question is whether there is a need for crypto-currency aside from the dark side of the Internet. I think so, with crypto-currencies being the “cash” of the Internet: no transaction fees, no fees for exchanging into other currencies. There is potential value in using this type of currency.

However, regulation and anonymity do not necessarily exclude each other. Take the analog world as an example: cash lets me buy whatever I like anonymously, but the place where I deposit my cash – the bank – is regulated. Banks should be clever enough to bring the trust that is created by regulation to the crypto-currency world.


Posted in Security | 2 comments

The new ABC: Agile businesses – connected

05.03.2014 by Martin Kuppinger

Agility is a key capability of successful organizations: the ability to quickly adapt the organization and the business model to new customer demands, innovations, and a changing competitive landscape. We live in a time where virtually all business relies on IT. Whether in retail, finance, or life sciences – business requires IT. The consequence is that IT has to support business agility. No IT agility = no business agility.

One of the biggest changes we are currently observing is the evolution from stand-alone to connected businesses. New collaborative business models, tighter and more flexible integration of customers and business partners, and the upcoming IoEE (Internet of Everything and Everyone) are driving the evolution of businesses. Cloud Computing, Mobile Computing, and Social Computing, the so-called “Computing Troika”, are already consequences of the business demand for agile and connected IT.

The challenge in this evolution is finding the balance between the business demand for agility and connectivity on the one hand and the IT and Information Security requirements on the other. Information Security can no longer think in terms of perimeters, devices, and system security. There is no closed perimeter anymore. Devices change constantly. Systems might become Cloud services the next day.

The other part of the challenge is managing the users. Instead of focusing on the employees and a few business partners, there is a demand for rapid on-boarding and off-boarding of customers and business partners in changing business and collaboration models. And there is the need to on-board employees to business partner systems, to manage users in industry collaboration networks, and to manage user access to Cloud services.

Information Security in these days of the new ABC is primarily driven by two evolutions. First, there is flexible user management that allows IT to manage the access of all types of users to all types of services – external users and internal users, on-premise IT and Cloud services. Having a (one!) user and access management infrastructure in place to support this change is a key success factor. This infrastructure commonly consists of a mix of on-premise and Cloud IAM.

The other fundamental shift is in what we protect. As the term “Information Security” implies, it is about securing information. In the new ABC, securing information is at the centre of attention. Technologies such as Information Rights Management allow for Secure Information Sharing.

IT that wants to succeed in supporting the business demand for agility and connectivity will have to move from traditional perimeter and device security towards information-centric approaches and flexible user management for all types of users. Identity and Information are the new perimeters for security, not the firewall or the device. Rethink your IT and Information Security – get ready for the new ABC.



Secure Information Sharing: Which approach to choose?

28.02.2014 by Martin Kuppinger

There are various approaches to Secure Information Sharing (SIS), as I have explained in previous posts. However, which one is the best? As always, there is no simple answer. It depends on the requirements of the customers. Nevertheless, the various product categories have their strengths and limitations.

Let’s look at the categories within SIS first:

  • IRM: Information Rights Management is about technologies that encrypt documents and assign entitlements to them. Users can only open the documents if they are entitled, and applications enforce the entitlements, such as limitations on printing, sharing, or editing (a minimal sketch of the concept follows after this list).
  • Secure Data Rooms: This category provides secure data stores that can be accessed by various persons, allowing them to share information. A typical use case is sharing data in mergers & acquisitions processes. Typically, online editing is allowed but downloading is restricted, so these solutions can also enforce restrictive entitlements on documents.
  • Collaborative Networks: These networks typically are focused on industry collaboration and provide environments that enable not only information sharing but also the management of the users and other functions. The obvious limitation is that they do not enforce entitlements on documents once these are downloaded. However, combination with IRM is potentially feasible.
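As announced above, here is a minimal Python sketch of the IRM concept – a document that is encrypted and carries a policy, with a compliant application enforcing per-action rights before decrypting. It illustrates the principle only; the policy format and functions are hypothetical, not any vendor’s API:

```python
# Minimal sketch of the IRM concept: the document is encrypted and carries a
# policy; a compliant application decrypts it only for entitled users and
# enforces per-action rights. Requires the 'cryptography' package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in real IRM, held by a central rights server
f = Fernet(key)

document = b"Confidential M&A term sheet"
policy = {"alice@example.com": {"view", "edit"},  # hypothetical entitlements
          "bob@partner.com": {"view"}}

protected = f.encrypt(document)  # only the ciphertext leaves the organization

def open_document(user: str, action: str) -> bytes:
    """A compliant viewer checks the policy before decrypting."""
    if action not in policy.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {action}")
    return f.decrypt(protected)

print(open_document("bob@partner.com", "view"))  # permitted
# open_document("bob@partner.com", "edit")       # raises PermissionError
```

The protection travels with the document: without both the key and a positive policy decision, the content remains unreadable – which is why IRM, unlike a data room, keeps working after download.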

When looking at these three concepts, IRM appears to be the best choice. The challenge until now has been that IRM solutions struggled with managing (external) users, lacked broad application support, and were mostly rather complex to implement. As mentioned in a previous blog post, Microsoft has removed these barriers with its Azure RMS service. Thus, IRM is now an approach that any organization should consider to fulfill its need for SIS. Aside from Microsoft, there are some other players in the market, such as Nextlabs, Covertix, Watchful Software, or Seclore. They might work well for specific requirements.

The strength of Secure Data Rooms is primarily that they are “ready to use”. Instead of setting up an IRM infrastructure – which, even based on Cloud offerings, requires some planning – they can be used immediately. Thus they are a good solution for rapid deployment. However, IRM appears to be the more sustainable concept.

Collaborative Networks play a somewhat different role, because they provide value-added services for communities within industries. They are not only a tool but a service: the larger the community, the higher the value.

All approaches to SIS have their strengths and their weaknesses. However, there is good news: there are now sufficiently mature options for any organization to finally start its SIS program. There is no excuse anymore for collaborating with business partners without SIS in place.

Don’t miss EIC 2014 – it will be the place to learn more about Secure Information Sharing.


Posted in Information Rights Management | Comments Off

Why Apple’s culture of secrecy is your biggest risk in BYOD

27.02.2014 by Martin Kuppinger

The news of the bug in Apple’s operating systems has spread this week. As Seth Rosenblatt wrote on CNET, Apple’s culture of secrecy has again delayed a security response. While there is a patch available for iOS, users of OS X still have to wait.

I have written before about the risks Apple’s culture of secrecy imposes on users. There are two major issues:

  • Apple informs neither adequately nor in a timely manner about security issues. Doing so is mandatory, including providing detailed information about workarounds and patches.
  • Apple still does not have an adequate patch policy in place.

It is well worth reading Rosenblatt’s article, as it provides a number of examples of the consequences Apple’s culture of secrecy has from a security perspective. I can wholeheartedly agree with his final paragraph:

“With its history of lengthy response times to critical security problems, Apple is equally long overdue for a serious re-evaluation of how they handle their insecurities.”

However, the culture of secrecy is just a consequence of Apple’s “we are the best and don’t make errors” hubris – a long tradition at Apple. They positioned themselves as the counterpoint to the error-prone Microsoft Windows products a long time ago. While Microsoft has learned its lessons in software quality, patch management, and security response, Apple has not. Apple has to learn that continuous improvement and a good approach to security response and patching are required of any vendor, even Apple.

This attitude of Apple’s also impacts the risk evaluation of BYOD strategies. If you can’t trust the vendor, you have to protect yourself. So what can you do, if you do not want to simply ban Apple devices until Apple provides an enterprise-class approach to security response and patching?

The simple yet expensive answer is: invest in additional BYOD security measures. There are various options out there, none of them being the “holy grail” of mobile security. However, if you combine information- and identity-centric approaches to security with mobile security, you should be able to better understand and mitigate your risks. Unfortunately, doing so means spending even more money to secure expensive hardware without added value. That is a high price to pay for allowing users to use Apple devices.

There will also be a price to pay in terms of restricted use. This might mean limiting access from insecure apps (and some are affected by the current bug) or imposing temporary access restrictions when new bugs are detected, until these are fixed. There might be a need to rely on other, more secure apps – for instance for accessing e-mail – instead of the built-in ones. As always, there is a price to pay: if you don’t want to carry the risk Apple puts on you with its inadequate security policy, you have to invest in security and restrict the use of these devices, impacting users’ convenience.

Unless Apple changes its security culture and overall attitude of “we are the best and don’t make errors”, the advice must be: don’t trust any organization that relies on a culture of secrecy. And take care of security yourself.


Posted in Information Security | Comments Off
© 2014 Martin Kuppinger, KuppingerCole