19.03.2013 by Martin Kuppinger
When looking through the security-related news of the past two weeks, there is very little that is surprising. Again, the usual topics, such as discussions about whom to accuse of cyber-attacks and about newly found attack vectors, have led to a series of news articles. There have also been ongoing discussions around privacy. However, as I have stated in my previous security blog post: most topics remain the same. Some weeks it is about routers; this time, reports about security weaknesses in connected HP printers and some routers (TP-Link) made the news.
However, there have been news articles on two topics that caught my attention.
Trend Micro on ICS/SCADA security
Trend Micro published results of a test they ran to analyze the real security threats for ICS (Industrial Control Systems) and SCADA (Supervisory Control and Data Acquisition) networks. These environments have been under attack by Stuxnet, Duqu, and Flame over the past years.
Trend Micro chose a small town in California and installed a virtual pumping station with a control system for water pressure. They made the station visible on the Internet. All software components existed, but there were no physical water pumps. They created three different “honeypots” with the typical weaknesses found in real-world environments.
Within roughly one month, Trend Micro detected 39 attacks from 14 different countries. The leading countries were China (35%), the USA (19%), and Laos (12%). At least twelve attacks appeared to be targeted. One or more attackers repeated 13 of the attacks on different days; these obviously were targeted and automated. Trend Micro is still investigating the other attacks.
Clearly, there is a well-established ecosystem for espionage and cyber terrorism out there. No single organization with industrial production environments and no single organization in the “critical infrastructure” area can claim that it is not an attack target. It is past time to act and to better protect all IT environments in organizations.
Obama vs. Merkel
I also found some news articles about Obama hosting a meeting on cyber-security with CEOs and putting cyber-threats amongst the top topics in his call with the Chinese president. This helps increase awareness in the industry, in governmental organizations, etc.
When looking at Germany, the situation is quite different. There are infrequent statements and activities from some of the ministries. There are some activities by different governmental organizations. However, there clearly is a lack of public statements and attention from Angela Merkel, if I compare this to Barack Obama. At the CeBIT 2013 fair, for instance, she visited the booth of a provider of secure smartphones, the “Merkel phone”, which allows her secure, encrypted/scrambled communication. I think that putting cyber-threats at the top of the agenda would have been far more important than putting the focus on that phone (and the technology provider behind it). Time to wake up, I’d say.
06.03.2013 by Martin Kuppinger
Yesterday I spent a day at the CeBIT fair, still the world’s largest IT fair. Besides the many interesting meetings I had previously scheduled, I started thinking about the CeBIT “Leitthema” – their “claim of the year”. This year it has been “Shareconomy”. I still do not know what this term is supposed to mean. There is some fuzzy description on the CeBIT homepage, but in contrast to topics like “Cloud” and “Managing Trust” in 2011 and 2012 respectively, Shareconomy – described as “sharing and using information, resources and experience based on new forms of collaboration” – is a very amorphous concept. They then try to associate it with crowdsourcing, smart infrastructures and smart grids, data security, big data, etc.
In fact, I think that there is something behind this rather strange buzzword. Back in September 2012, KuppingerCole hosted an event about the 4Cs: Communication, Collaboration, Content, and Cloud, which was about enabling new ways of collaboration and communication in a secure way. That probably is what the Shareconomy is all about.
When I look at our advisory business, I see another red-hot topic. In German I’d call it “Umgang mit Dritten”, i.e. how to interact with third parties and the services they provide in a consistent, standardized way. That is about Cloud Security, Identity Federation, the API Economy and security therein, etc. Opening up the perimeter and supporting business processes that integrate business partners, customers, etc. is highly important. So maybe that is also part of the Shareconomy. For sure, you will be able to learn a lot about this at our upcoming EIC – the real stuff, not the marketing buzz and fuzz. To highlight just a few sessions:
However, the thing that confused me most at CeBIT – in the context of their Shareconomy claim – was the lack of free WiFi. Sharing without connectivity? Or at least sharing without free or affordable connectivity? Will that work? I doubt it. I used the UMTS cards in my notebook and iPad respectively, because otherwise I would have had to pay 30 € for a 4-hour WiFi pass. That is even more than in the old-school hotels that still charge for WiFi. Ridiculous.
06.03.2013 by Martin Kuppinger
One of the topics I’ve been evangelizing for years is Dynamic Authorization Management. Dynamic Authorization Management is about externalizing authorization decisions outside of applications. It is about using an “application security infrastructure” which performs the authorization decisions (and manages other aspects of security like authentication, the administration of users etc.). It is about relying on security services instead of implementing security in every application.
Dynamic Authorization Management is often associated with XACML (eXtensible Access Control Markup Language). XACML in fact is a standard for implementing Dynamic Authorization Management, but the concept is not limited to XACML. In fact, Web Access Management systems implement the concept of Dynamic Authorization Management in a coarse-grained way, and some of these systems, as well as some of the Policy/Entitlement Server products available, provide their own proprietary APIs.
Before discussing the best approach to implementing Dynamic Authorization Management, it is important to understand the basic principles and their benefits. Within the concept of Dynamic Authorization Management, an application asks the authorization system for an authorization decision. It provides some information with this request, e.g. the user ID. Depending on the implementation, other attributes might be delivered in addition. The authorization system takes this information and collects additional information if required. It might ask an authentication system for more context information, retrieve roles from a directory service, etc. It then uses that information and the business rules (authorization rules) retrieved from a policy repository to make the authorization decision. Having done that, it returns the decision to the requesting system.
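The request/decision flow described here can be sketched in a few lines of code. This is a deliberately minimal illustration – the names and data structures are invented for this sketch and reflect neither a specific product API nor the XACML request format:

```python
# Minimal sketch of externalized authorization: the application (Policy
# Enforcement Point) asks a central Policy Decision Point and only enforces.

ROLES = {"alice": {"purchasing-manager"}}   # e.g. retrieved from a directory service

POLICIES = [
    # (required role, resource, action)
    ("purchasing-manager", "purchase-order", "approve"),
]

def decide(user: str, resource: str, action: str) -> str:
    """Policy Decision Point: combine request attributes with roles and rules."""
    roles = ROLES.get(user, set())
    for required_role, res, act in POLICIES:
        if res == resource and act == action and required_role in roles:
            return "Permit"
    return "Deny"

# The application sends the request attributes and enforces the answer:
print(decide("alice", "purchase-order", "approve"))  # Permit
print(decide("bob", "purchase-order", "approve"))    # Deny
```

The point is that the application itself holds neither the roles nor the rules; both live in the central system and can change without touching the application.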
The obvious advantage is that applications do not need to manage users, authentications, or authorizations. They just ask a central (logically central, but potentially physically distributed and logically “partitioned”) system. There is no longer a need to manage authorization rules within the application. Thus there is no need to provision that information into that application.
That in consequence means that there is also no on-going need to revoke access. IAM (Identity and Access Management) is not about “ensuring that access is revoked correctly” anymore, because there is nothing to revoke (from applications). There is also nothing to grant anymore within the applications.
Everything is managed centrally. Changes are made centrally and become effective immediately. While Identity Provisioning will decrease in relevance, Access Governance will remain important. Identity Provisioning will have to cover far fewer targets than today, once a few central instances are used as repositories and target systems no longer hold authorization information locally. Access Governance will have to move from reviewing static access controls in target systems to reviewing dynamic business and authorization rules in the central authorization system – a capability already supported by some early adopters in the Access Governance market.
A strength of this concept is that such systems not only can enforce standard authorization rules but also business rules. Many role management projects suffer when it comes to supporting “competencies” or “constraints”, e.g. limits for the approval of POs etc. This is fairly simple to implement and enforce in Dynamic Authorization Management.
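To make this concrete: an approval limit is hard to model as a static role, but trivial as a rule evaluated at request time. The attribute names in this sketch are invented for illustration:

```python
def may_approve_po(user_attrs: dict, po_amount: float) -> bool:
    """Business-rule check: the approval limit is evaluated dynamically
    at request time instead of being provisioned into the application."""
    return (
        "po-approver" in user_attrs.get("roles", set())
        and po_amount <= user_attrs.get("approval_limit", 0)
    )

manager = {"roles": {"po-approver"}, "approval_limit": 50000}
print(may_approve_po(manager, 20000))  # True
print(may_approve_po(manager, 80000))  # False: exceeds the limit
```

Raising the manager's limit is a single attribute change in the central system; no entitlement in any application needs to be re-provisioned.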
The concept in fact is not really new. In the mainframe world, it has been around at least since the mid ‘70s – you just need to look at tools like RACF, but also several proprietary implementations of large organizations for their “entitlement management systems”.
However, there is no such thing as a free lunch. The obvious challenge is performance – can such a system be fast enough for today’s business needs? The best answer is given by the users of these systems: Large banks and large eCommerce sites are relying on these approaches today.
The biggest challenge in reality is that applications have to change. That in consequence means that the way applications are architected and developed has to change. The mindset of application architects and application developers has to change and these groups have to collaborate closely with the IT Security and IT Infrastructure people. However, done right architecting and coding applications will become easier given that architects and developers no longer need to ‘bake in’ authorization, authentication, etc., but can simply rely on the external service. Obviously, providing lean and simple approaches for Dynamic Authorization Management is a key success factor for this type of technology.
Dynamic Authorization Management is not about a rapid change, it is about moving towards a better model over time. To do that, you should start now. Every single application is a win on that journey. Security risks and complexity of management will be reduced. And Dynamic Authorization Management will allow you to focus on the key issue: Allowing people to do exactly what the business wants them to do (and not more) – instead of technically granting and revoking access per application.
As always, there will be several sessions around Dynamic Authorization Management, XACML, etc. at this year’s EIC: Munich, May 14th to 17th.
06.03.2013 by Martin Kuppinger
When I started writing this series of blog posts recently, I thought that I would have sufficient material for a weekly post. However, when consistently looking at the security news from various sources, it becomes obvious that there are a few recurring topics:
- New (and old) waves of attacks and new and old types of malware
- New exploits – the target of choice differs, the topic always remains the same
- Discussions about privacy
- Vendors with inappropriate security patch policies
Yes, sometimes there are interesting announcements from vendors. However, besides the new big data approaches of IBM and RSA Security that I have covered before, there has not been much big news this week, despite the RSA Security Conference in the U.S. and the CeBIT fair in Germany starting today (which, by the way, still is the largest IT fair worldwide).
Let’s have a quick look at the most important news.
Java as the new target of choice
It comes as no surprise that there is an increasing number of attacks using Java exploits. This includes some of the known exploits, but also some new ones. This also is not surprising, given that hackers look for related weaknesses once a particular type of exploit has been identified. Consequently, this means that Java updates have to be performed regularly and that the use of Java (especially within the browser) has to be carefully reconsidered.
Privacy vs. Freedom of Speech?
I read a fairly strange article on a lawsuit Google is facing in Spain these days. The article argues that the privacy debate over here in Europe is about “Privacy vs. Freedom of Speech”. In fact, the argument raised therein is that Google is allowed to publish a link based on the right to freedom of speech. Notably, this right exists in Europe as well, not only “Fair Speech” as the author assumes. And the idea behind freedom of speech in Europe is to protect the individual, not only society – which is in stark contrast to what the author says. Maybe the difference is that Europeans do not tend to protect questionable business models and principles through one of the fundamental human rights. From my (European) perspective, the article is based on a fundamental misunderstanding and misconception of what is considered the European position. Notably, there is no single European position, but an intensive debate about these topics.
There is little change in the news around cyber-attacks. There are still masses of attacks, and the discussion about who is behind them is continuing. There is good reason to assume that some of the attacks are state-sponsored, while others are caused by cyber criminals. In the end, it is about accepting that there is a severe risk for any organization and any individual, and that we need to protect ourselves in a more sophisticated way. In a Trend Micro press release I received yesterday, the author compared it with the “fork” in chess, where you create two threats at a time. The other player can’t defend against both at the same time (but he might threaten you in another way). The author’s argument was that with a fork, i.e. multiple defense layers, the attackers are always in danger of being detected. I’m not sure whether the fork is the best chess pattern for this comparison – if anything, it describes the approach an attacker could take – but I liked the analogy.
The victim of the week has been Evernote – they reported that some data has been hacked and asked all of their users to reset passwords. Who will be next?
28.02.2013 by Martin Kuppinger
Last week I received a newsletter from Radiant Logic, a vendor of Virtual Directory Services and some other IAM stuff like Federation Services. This newsletter pointed to a video of a presentation of Gartner analyst Ian Glazer titled “Killing Identity Management in Order to Save it,” which had been published on February 7th, 2013.
In this video he spends a lot of time talking about some topics like
- IAM is too static and typically HR driven
- IAM is not focused on providing services and integrating with business applications
- IAM is based on LDAP (and CSV) and other hierarchical approaches
- 2013 will be the year of identity standards, especially OAuth, OpenID Connect, and SCIM
- Identity Services like those provided by Salesforce.com
When I read the newsletter from Radiant Logic – which takes a fairly different view than Ian Glazer – and watched the video, I started looking for some of the things my colleagues and I have written about this.
There is for example an article at our website talking about the fact that HR should not be the only leading system for IAM – the article dates back to 2007 (and is available in German only). And there are more, which are about things like the Identity Explosion and the need to deal with far more users.
I found several articles for example from back in 2008 looking at Identity Services and there were webinars and reports around that topic years ago. Some vendors have been doing integration of Identity Services into business applications, Oracle for example, for years now.
The end of LDAP in its current state was the topic of a blog post back in 2010 and I started discussing this with advisory customers at the same time.
Oh yes, clearly the standards mentioned will become more important this year. My colleague Craig Burton has described this on several occasions, including the KuppingerCole Scenario “The Future of Authentication”. And last year’s EIC hosted a workshop talking about the relevance of all these upcoming standards.
All the topics around identity services hosted by Salesforce.com or Microsoft’s upcoming Windows Azure Active Directory have also been a frequent topic in Craig’s blog posts and in some of our research notes.
There is nothing wrong with these theses. However, there is also not that much new in them.
Below the link to the video of the Ian Glazer presentation, there is the following claim:
The way the industry does identity management cannot incrementally improve to me [sic] future (and current) needs. I believe IAM must be killed off and reborn.
Given the fact I do a lot of advisory work besides research, like all the KuppingerCole analysts, I really struggle with this claim. There is no doubt about the fact that we need to “extend and embrace” what we are doing traditionally in IAM. It is about more than Identity Provisioning. Topics like versatile and context-/risk-based authentication and authorization, together with Identity Federation, are moving towards the center of attention – not only for core IAM challenges. We need to understand that there are new challenges imposed by the Computing Troika and that traditional approaches will not solve these.
However, I do not believe in disruptiveness. I believe in approaches that build on existing investments. IAM has to change, no doubt about that. But there will still be a lot of “old school” IAM alongside the “new school” parts. Time and time again it has been proven that change without a migration path is an invitation to disaster. Embrace and extend is the classic migration methodology for technical transformation strategies.
I plan to do a session on this topic at EIC 2013 – don’t miss it if you want to protect your investments and spend your budgets in a targeted way to meet today’s and tomorrow’s challenges in IAM.
25.02.2013 by Martin Kuppinger
OK, in fact this is about the last few weeks in security this time – but in the future it will mostly be about looking back at the previous week.
The permanent threats: Chinese hackers, Anonymous,…
Not a single week goes by without news about attacks from various groups. This includes Chinese hackers who are alleged to have attacked the Wall Street Journal, or Anonymous, which claimed to have successfully attacked the US Federal Reserve. In the latter incident, it took four days from the announcement by Anonymous until the official statement of the US Federal Reserve. An additional cyber-attack hit the US Department of Energy, according to another news article.
There have been numerous articles about these attacks since, with different parties in the U.S. linking them to official Chinese agencies and the Chinese Army, while China denies these accusations citing a lack of proof.
Attacking the big ones
In this context, the recent attacks on Apple, Facebook, Twitter, and Microsoft (and possibly several other companies) also gained a lot of public interest. U.S. investigators assume that these attacks were driven by Eastern European cybercriminals rather than being Chinese state-sponsored, according to recent news articles.
Kaspersky kills Internet access for Windows XP users – accidentally
A recent Kaspersky antivirus update this month at least partially disabled Internet connectivity for Windows XP users. There is a workaround and a fix available; however, it takes some manual action to solve the problem – no surprise, given that Internet access no longer works as expected. Unfortunately, there is no prominent direct link to the information on this issue on Kaspersky’s home page.
Path app ignores privacy again
An article on CNET revealed another privacy issue in the social network Path. Location data might leak even when access to the location is disabled. Given that Path recently had some trouble with the FTC (U.S. Federal Trade Commission) and had to pay a fine, this new issue comes at the wrong time for them. It also again sheds light on the ignorance or incompetence of start-up companies when it comes to security and privacy – probably both. It will be interesting to see when the growing awareness and concerns of users finally lead them to stop using such services.
EU Commission introduces Cyber Security Plan
The EU Commission this week announced its Cyber Security Plan to strengthen resistance against cyber-attacks and cybercrime. The plan includes the idea of a European Cyber Defense Policy. It also includes the concept of an “attack notification obligation”. The latter led to some intense discussions, because some companies do not want to inform the public about these issues. As of now, virtually all large organizations have experienced some form of attack. However, this is currently only discussed behind closed doors between the CISOs of these organizations. An attack notification obligation would change that and provide far more information to officials. On the other hand, it would increase cyber security concerns among the broad public – which might be seen as a positive effect, given that it might also increase caution.
A lot of router security issues
Last week, there were again several news articles about security issues of routers and other network devices, including D-Link. At least D-Link delivered some firmware patches, while other devices remain insecure. Which raises the question: Do you have patch management for the firmware of all your devices in place? Another interesting question: Which of the hardware vendors has a well-defined approach for security alerts and security patches in place? The bad news, when following this issue over the past few weeks, is that most vendors are neither willing nor capable of providing patches fast and in a simple-to-apply way. It is long past time for hardware vendors to start working on such an approach – and it is long past time for customers to have a complete patch management plan in place, from firmware up to applications.
Are stronger passwords really THE trend?
In its TMT Predictions (Technology, Media & Telecommunications), Deloitte predicts the end of “strong password only security”. The solution proposed is multi-factor authentication, along with password vaults. However, most of the text focuses on using stronger passwords, longer than eight characters. My colleague Craig Burton recently made the statement: “There is no such thing as a password muscle you can strengthen by training.” Which is to say: people are limited when it comes to keeping passwords in mind, and recommending the use of longer and more complex passwords is not the ideal solution. You do not get better when you have to keep many long and complex passwords in mind; you just resort to workarounds like writing them down or always re-using the same password.
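A quick back-of-the-envelope calculation illustrates the trade-off: entropy grows with length and alphabet size, but human memorability does not. The numbers below are only a sketch and assume secrets chosen uniformly at random – which real users rarely do:

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a secret chosen uniformly at random."""
    return length * math.log2(alphabet_size)

# 8 random lower-case characters vs. a 4-word passphrase from a 2,000-word list
print(round(entropy_bits(26, 8), 1))    # 37.6
print(round(entropy_bits(2000, 4), 1))  # 43.9
```

A memorable four-word passphrase already beats eight random characters – but multiplying such secrets across dozens of accounts is exactly where the "password muscle" gives out, which is the author's point.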
When talking about multi-factor authentication, I would rather say that this was already called a “trend” some years back. Yes, we will observe some more implementations. However, multi-factor authentication by itself is not sufficient. Some two years ago, I blogged about the RSA SecurID incident. My recommendation at that time was to think about versatile authentication, combined with multi-factor authentication. Not that this concept was absolutely new back then…
Clearly, there is a trend towards approaches for strong, simple, and flexible authentication, beyond passwords. However, just talking about multi-factor authentication and password vaults is not sufficient. What organizations should evaluate are versatile authentication and, as the next and logical step, context- and risk-based authentication and authorization. That is the real trend. It is about understanding the bigger picture. Look at this to understand the future of authentication and authorization, not at a point approach.
In this context, it is definitely worthwhile to attend EIC 2013 – the future of authentication and authorization and the trends we observe will be an important part of the agenda.
02.02.2013 by Martin Kuppinger
Chinese hackers, US newspapers
This week, several US newspapers, including The New York Times and Wall Street Journal, have reported that they have experienced cyber-attacks related to their coverage of China. In the case of The Times, corporate passwords for every employee had been stolen. Chinese officials called allegations that the Chinese Government commissioned these attacks “unprofessional and baseless”. However, it is not likely that Chinese hackers caused these incidents without at least tacit government approval. In fact, this appears to be sort of a sideshow to the bigger, unofficial and hidden cyber-war (a 21st century sort of a “cold war”) running in the background.
Distrust as a business model?
In a recent survey, the Ponemon Institute asked U.S. adults about the five companies they trust the most to protect the privacy of their personal information. It comes as no surprise that most of the companies forming the “Internet Association” do not rank within the Top 20 of this list. Some like Apple have been in the Top 20 for years. On the other hand, Microsoft is now amongst these top-ranked companies. Overall, the Internet and Social Network providers have low ratings. It will be interesting to observe whether “distrust” as a business model really works over a longer period. The study clearly shows that users are aware of privacy risks. The greater this awareness, the bigger the business risk for the ones who are ignoring these concerns.
UPnP networking flaw puts millions of PCs at risk
A recently discovered security flaw in the UPnP (Universal Plug and Play) networking protocol potentially puts millions of PCs at risk. UPnP is a protocol that allows network devices like printers, PCs, or routers to discover each other. By design, this discovery should be limited to the local network. However, the flaw allows attackers to identify devices on the Internet and run some well-known attacks against them. The reason for the mass of vulnerable systems is that the software libraries used to implement UPnP contain flaws. Most likely, many of the systems at risk will never be patched, because these devices are not sold anymore. Thus, there is a significant risk. Unfortunately, there is no simple solution to this issue. The best approach is to ensure that all incoming UPnP requests are blocked at the router and that this device itself does not use UPnP.
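To get a rough idea of which UPnP devices answer on your own network, you can send the standard SSDP discovery request yourself. This is a minimal sketch for illustration only – it lists responders on the local network and says nothing about whether your router also exposes UPnP towards the Internet:

```python
import socket

SSDP_ADDR = ("239.255.255.250", 1900)   # standard SSDP multicast address/port
M_SEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: ssdp:all",
    "", "",
])

def discover(timeout: float = 2.0) -> list:
    """Send an SSDP M-SEARCH and collect responses until the timeout expires."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(M_SEARCH.encode("ascii"), SSDP_ADDR)
    responders = []
    try:
        while True:
            data, addr = sock.recvfrom(65507)
            responders.append((addr[0], data.decode("ascii", "replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    return responders

# Example usage (on a real LAN):
#   for ip, response in discover():
#       print(ip, response.splitlines()[0])
```

Every device this turns up is one whose UPnP firmware should be on your patch list – and one to check against your router's port-forwarding and firewall settings.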
Where will all these people come from?
According to recent news, the Pentagon has decided to increase staffing for its cybersecurity force from 900 to 4,500 people. The most important question this news evokes: where will all these people come from? I have no clue. We are in a situation where we lack experienced IT security professionals. Hiring 3,600 more of this rare species will be a tough job for the Pentagon. It will also sweep the market for cybersecurity professionals clean. For other companies and organizations, that means they will increasingly rely on Managed Security Service Providers, which at least can benefit to some degree from “economies of scale”. The most important cybersecurity challenge for every economy in the upcoming years will be to push the education of IT security professionals. Not only IT but IT security has to become part of education, starting in school. And IT security as a field of study should become one of the most attractive ones, to create the supply that governmental and private organizations urgently need.
WhatsApp violates international privacy laws
According to Canadian and Dutch data protection authorities, WhatsApp violates international privacy laws. Users have no way to use the application without granting access to their entire address book. For company policies, that simply means that usage of WhatsApp is unacceptable as long as any company-related address information is held on the device. Maybe WhatsApp should really start thinking about security and privacy.
Apple iOS 6.1 – still an unacceptable approach for security patches
Apple this week released iOS 6.1. The update addresses a number of security issues. Amongst these are around 20 that allow infiltrating systems via the Internet and executing code on the target systems. Most of the bugs are related to WebKit, which forms the foundation of the iOS browser Safari. Some of them have been known for quite a while, even though there are no known attacks based on them. Nevertheless, delivering security patches late rather than just in time – as Microsoft, Oracle, and even Adobe manage to do – is simply inadequate. It is long past time for Apple to move towards better approaches to security patching. By the way: the update once again deleted the specific APN settings of my UMTS card. Updates that are not able to keep all configurations are just unprofessional.
Online banking: 25% of Germans don’t use it due to security concerns
A survey in Germany, commissioned by the initiative D21 (Digital 21), showed that 26% of the participants do not use online banking due to security concerns. That comes as no surprise when looking at some of the recent incidents. It also sheds an interesting light on the investments of banks to secure online business. A common complaint of banks is that securing online banking is too expensive. That is the reason for not investing in the most advanced technologies, or for charging customers for every SMS sent out-of-band with a TAN (transaction number). However, besides the money banks lose to successful attacks, the cost of a quarter of the customers still relying on classical banking methods – with the manual handling involved and thus high costs – should not be underestimated. Banks should also consider that this might be just the beginning of a trend, with more customers moving back to traditional methods. That would further increase costs for banks. Investing more in really secure online banking might be the cheaper way.
IBM and RSA build security analytics on Big Data technology
A recent announcement from IBM and information from RSA show that Big Data technology is gaining momentum as a foundation for security analytics. This goes well beyond traditional SIEM (Security Information and Event Management) and opens new opportunities for advanced analytics of data from various sources. More on that in upcoming blog posts, KuppingerCole reports, and at EIC 2013.
28.01.2013 by Martin Kuppinger
Recently a story about Google hit the news: according to an article in Wired, “Google declares war on the password”. Google wants to integrate hardware-based authentication into the browser. The approach is based on the idea of using a USB key or an NFC (Near Field Communication) device to log into applications. Currently, Google uses a YubiKey, developed by Yubico.
This brought my attention back to Yubico. Some months ago, I had a conversation with their CEO Stina Ehrensvärd. She unveiled some of the new devices Yubico is working on, including their YubiKey NEO, which supports both NFC and USB, and their YubiKey Nano, which is so small that it is designed to be put into a USB port and to remain there. There are other YubiKeys out there as well, but these two are the most interesting ones.
In contrast to other vendors, Yubico focuses on a “lightweight” approach with fairly cheap devices and little overhead. They also deliver free and open source software for the backend side, but mainly rely on partners. Customers can simply buy a YubiKey online, download the free software or turn to an enterprise software partner supporting Yubikey, including Quest Software, Duo Security, and Digital Persona. A growing number of consumers are also using YubiKey with password managers, including Password Safe, Passpack and LastPass. Adding Google to the list of partners would obviously be a very big deal for Yubico.
A more interesting question however is a simple one: is this approach good enough to really replace passwords? When we look at the authentication space, there are three factors: Knowledge, Ownership, and Biometrics. Quite some time ago, I wrote a report providing a market overview on conceptual approaches for strong authentication together with my two colleagues Prof. Dr. Sachar Paulus and Sebastian Rohr. This provides an in-depth analysis of strengths and weaknesses of different approaches.
First, having only a token will not be sufficient: that would be mere one-factor authentication. When working with a username and password, there are at least two elements, both based on knowledge – though you might argue that an e-mail address used as a username doesn’t really count, given that it typically is public information.
Two-factor authentication is not secure by design either – there has, for instance, been a recent incident in online banking where both factors were attacked successfully.
Therefore, I don’t see the future in having just a device like the YubiKey: anyone with access to that device could log on with it, if no additional factor is used. However, the combination of such a device with a password delivers real two-factor authentication. When we asked Ehrensvärd about this, she clarified that, although it was not highlighted by Google in the IEEE white paper that Wired reviewed, Yubico’s approach and vision is to always combine a YubiKey with at least a simple PIN or password.
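The combination Ehrensvärd describes can be sketched as a server-side check in which both the ownership factor and the knowledge factor must verify. This is a minimal illustration; the function names and the PIN storage scheme (salted PBKDF2) are my assumptions, not Yubico’s or Google’s actual implementation.

```python
import hashlib
import hmac

# Hypothetical server-side two-factor check: a login succeeds only if BOTH
# the one-time code from the device AND the user's PIN verify.
def verify_pin(pin: str, stored_hash: bytes, salt: bytes) -> bool:
    # Derive a hash from the supplied PIN and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

def verify_login(otp_is_valid: bool, pin: str, stored_hash: bytes, salt: bytes) -> bool:
    # Both factors are evaluated; neither the token alone nor the PIN
    # alone is enough.
    return verify_pin(pin, stored_hash, salt) and otp_is_valid
```

The point of the sketch is the conjunction: losing the token does not expose the account, and a phished PIN is useless without the device.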
One problem will remain in any case: your password is in your brain, available everywhere and anytime (unless you forget it), while a token has to be carried around. These days, when most of us use multiple devices, that can become rather inconvenient. If we leave the token in the device, we opt for a rather insecure approach – everyone with access to the device then has access to the token as well. But carrying a token around is not something users willingly do, even with very small form factors like the ones Yubico provides. NFC solves some of the problems, because one device can be used for multiple systems, but you still have to remember to carry it.
I personally would prefer a credit-card-sized NFC device, plus the choice between a password and an OTP (one-time password) sent out-of-band to my cell phone as the second factor. My wallet and my cell phone are the two assets I typically carry around. When I discussed this with Ehrensvärd, she answered that many customers place the NFC-enabled YubiKey NEO in their wallet and then tap the whole wallet against a smartphone or NFC-enabled laptop to log in.
Although Yubico does not yet offer a card form factor, their approach is quite interesting – and they might get a big push from Google in the future. If you are looking for affordable approaches to stronger authentication, you should have a look at Yubico today. Even though it is not the perfect solution to the stronger-authentication challenge, Yubico provides an interesting alternative.
18.12.2012 by Martin Kuppinger
Some weeks ago I stumbled upon an article which claimed that the MDM (Mobile Device Management) market will grow massively within the next five years. I don’t doubt that the market will grow. However, I’d raise the question of whether it should grow that much – or, in other words, whether MDM is really the solution of choice. There clearly is some need for MDM technologies. However, MDM might better be understood as an element of other technologies, or as a tactical piece of a bigger puzzle.
Let me explain why.
The problem organizations are facing today is that there are more users, more types of devices, and more deployment models they have to deal with. They need to give their users access to the information they need (and thus the information systems they need), regardless of the device and the deployment model – while enforcing information security and regulatory compliance. It is about the impact of Cloud Computing, Mobile Computing, and Social Computing, and how to deal with it in a secure and compliant manner.
This “Computing Troika” means that we have to strategically change the way we are dealing with identities and access. We have more identities and we have to support more ways of gaining access – to resources which are sprawled across multiple deployment models.
Notably, this is not only about users with smartphones or tablets, the devices primarily in the scope of MDM technologies (even though some of these grow beyond that, adding support for Microsoft Windows 8 or Apple OS X). It is about a multitude of devices: the classical desktop PC in the company, in the home office, or in an Internet café; the notebooks of employees and of all the different types of externals; all the smartphones and tablets; and potentially devices we cannot even imagine today. And I’m not even speaking about the Internet of Things and M2M (machine-to-machine) communication here, which is also about identities requiring access.
Can we solve this by managing mobile devices? Obviously, that can help. But it is far from solving the strategic challenge. Furthermore, any approach that focuses on the disparate management of one group of devices is questionable. Why not focus on solutions that help manage all types of devices, including the “traditional” ones?
Obviously, a device-centric strategy that differentiates between a few types of devices is not sufficient to solve today’s challenges. The same is true for network-centric approaches – if there no longer is *the* perimeter, protection focused on that perimeter is insufficient.
The future lies in understanding the risk of information access and comparing it with the risk of the access request. The risk of the access request is based on context, a topic my colleague Dave Kearns focused on in his EIC keynote some four years ago. Context is about the device, the location, the type and strength of authentication, the role of the user and thus also his or her relationship to the organization, the health status of the device, and many other aspects. If the balance of information risk and access risk is positive – fine. If not, the access risk can either be mitigated, for example by step-up authentication, or the access can be refused or at least limited.
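The risk balance described above can be sketched as a simple decision function: score the context of the request, compare it with the risk the information owner is willing to accept, and then allow, demand step-up authentication, or refuse. All factors, weights, and thresholds below are invented for illustration; a real implementation would draw on far richer context.

```python
# Illustrative sketch of risk-/context-based access decisions. The context
# attributes and the scoring are assumptions made for this example.
def access_decision(acceptable_risk: int, context: dict) -> str:
    access_risk = 0
    if context.get("device") == "unmanaged":
        access_risk += 2          # unknown device health
    if context.get("location") == "untrusted_network":
        access_risk += 2          # e.g. Internet café
    if context.get("auth_strength") == "password_only":
        access_risk += 1          # weak single-factor authentication
    if access_risk <= acceptable_risk:
        return "allow"
    if access_risk - acceptable_risk == 1:
        return "step_up"          # gap small enough to mitigate
    return "deny"
```

Step-up authentication is the mitigation path: rather than refusing outright, the system asks for a stronger proof of identity to close the gap between access risk and information risk.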
That requires technologies like versatile authentication, risk-/context-based authentication and authorization, and Dynamic Authorization Management. The latter is required to enable applications to do dynamic authorizations based on policies and on the context, instead of hard-coding authorization rules or at best relying on coarse-grain decisions. It is about putting a risk- and context-aware approach to information security at the centre, instead of artificially protecting devices (instead of information) or perimeters.
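A minimal sketch of what Dynamic Authorization Management means in practice: authorization rules live as externalized policy data that is evaluated at request time against the context, instead of being hard-coded into the application. The policy format and attribute names here are invented for illustration.

```python
# Sketch of externalized, policy-based authorization (format is invented).
# Rules are data, not logic compiled into the application.
POLICIES = [
    {"resource": "payroll", "role": "hr",    "max_device_risk": 1},
    {"resource": "crm",     "role": "sales", "max_device_risk": 2},
]

def authorize(resource: str, role: str, device_risk: int) -> str:
    # Permit only if some policy matches resource, role, and context.
    for policy in POLICIES:
        if (policy["resource"] == resource
                and policy["role"] == role
                and device_risk <= policy["max_device_risk"]):
            return "permit"
    return "deny"  # default-deny when no policy matches
```

Changing who may access what, and under which context, then means editing policy data rather than redeploying application code – which is the point of moving beyond hard-coded or coarse-grained decisions.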
MDM might help in mitigating risks for some devices, so it is a concept within that bigger picture. However, without understanding and addressing the bigger picture, MDM is more of an alibi than a real solution. Furthermore, when MDM is viewed within that bigger picture, with all the devices in mind that can be used to access corporate information systems, there is good reason to look for solutions that integrate MDM into a bigger scope – like Client Lifecycle Management solutions, which manage all types of devices.
Nevertheless, the MDM market will grow for some time. However, it will also change, maybe more quickly than many expect today. And, most importantly, there are other technical building blocks you should look at first, to address the cause and not just the symptom.
13.12.2012 by Martin Kuppinger
In a recently published study, the two software vendors Versafe and Check Point Software Technologies analyze the recent Eurograbber attack, based on the Zeus botnet and ZitMo (Zeus in the Mobile). This attack reportedly diverted up to €36 million by intercepting financial transactions.
The most interesting aspect is that the attack bypassed the out-of-band authentication of financial transactions. Banks use this approach to send TAN codes (transaction numbers) to the user’s mobile phone. It is out-of-band if (and as long as) the user uses another device, such as a PC, to access the online banking application.
This approach has been considered (more or less) secure. In the scenario described, however, the attackers targeted both devices. They first attacked the PC, where they tracked and manipulated online banking sessions and asked for the phone number and device type of the mobile phone used. Using that information, they sent an “important security update” to the mobile phone, which contained the malware for that device. Based on that, they could intercept transactions and steal mobile TANs.
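The mobile TAN mechanism can be sketched as follows, with all protocol details invented for illustration: the bank derives a short one-time code bound to the exact transaction details and sends it over a second channel, and verification succeeds only for the transaction the code was issued for.

```python
import hashlib
import hmac
import secrets

# Simplified sketch of a mobile TAN (mTAN) scheme; details are illustrative.
# A real bank would store and expire issued TANs rather than re-derive them.
SERVER_KEY = secrets.token_bytes(32)

def issue_tan(account: str, amount: str) -> str:
    # Bind the code to the exact transaction details.
    msg = f"{account}|{amount}".encode()
    digest = hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()
    return digest[:6]  # short code the bank would text to the customer

def verify_tan(account: str, amount: str, tan: str) -> bool:
    # Only the transaction the code was issued for will verify.
    return hmac.compare_digest(issue_tan(account, amount), tan)
```

Because the code is bound to the transaction details, a stolen TAN cannot simply be replayed for a different transfer – which is why Eurograbber also had to manipulate the banking session on the PC, so that victims unknowingly confirmed the attackers’ transactions.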
It is very likely that this is just the beginning of such attacks, which challenge the security of out-of-band mechanisms. As of now, the attack requires Windows on the PC and Android or BlackBerry on the mobile device. However, it is just a question of time until iOS, OS X, or Windows Phone also become “supported” by the attackers.
There is no simple way to prevent this type of attack. The most important security measure is good anti-malware protection on every PC and, ideally, on every mobile device – as long as such a solution exists for the mobile device. Beyond that, fraud detection at the backend, i.e. at the banks, is mandatory to identify such incidents as quickly as possible and to alert customers.
But clearly, this type of attack shows that out-of-band authentication, as (relatively) convenient as it might be, is not the holy grail of security in online banking. Maybe this issue will initiate the comeback of other solutions that are more expensive (in procurement and logistics) and sometimes less convenient, such as OTP hardware tokens. Maybe it is time for financial institutions to focus on a reusable approach for such OTP hardware tokens, because cost has always been one of the major inhibitors of acceptance of these devices.