06.05.2012 by Sebastian Rohr
My dear friend Mia Harbitz of the Inter-American Development Bank (www.iadb.org) recently pointed me to what I feel is one of the most important papers on “Identity Management” since I started working in this field. The paper does not analyze the pros and cons of bottom-up versus top-down role design, nor does it dive into the depths of Access Governance and streamlining reconciliation efforts in your organization.
It investigates what many of you claim (and have probably experienced yourself) to be a birthright: your own personal identity! We all know the fuss around Google+ and the headache it gave Kaliya “Identity Woman” when she was blocked from using G+ for not using her “real name” but a moniker she is widely known under – better known, in fact, than her real name (which I only found out during the discussion around G+!). The paper – which I recommend you all read – does not care about these problems, which seem SO huge to us but merely touch a small fraction of all mankind (which is, by the way, true of about 99% of the problems I solve during my work…). It cares about the problems of billions of people not even HAVING an identity, because they were not registered by their mothers upon birth and thus do not have a valid birth certificate.
Without further ado, please all read the paper “Travelling the Distance: a GPS-based study of the access to birth registration services in Latin America and the Caribbean” … It is an eye-opener to the problems of the “real world of identity management” and we as the crusaders of the digital world should not leave behind our fellow humans on the side of the “digital divide” …
27.01.2012 by Sebastian Rohr
I still remember the fun that was had when Dick Hardt first gave his cool presentations on User Centric Identity Management and regaining control over who would have access to which attribute of your multiple personas, be it online, at home or at work. We all know that his company Sxip Identity failed because it did not gain enough momentum to monetize the idea. Still, concepts such as the (also “failed”, much to my dismay) Information Cards by Microsoft or the OpenID approach share some aspects of the Sxipper product – putting you in control of your data. The current hype around the new EU privacy and data security legislation is putting some more focus on this!
Apparently, only very tech-savvy users – geeks like you and me – seem to widely adopt and use OpenID. I, personally, was attracted to Clavid, a Swiss IdP that combines OpenID with the one thing missing everywhere else: Strong Authentication! Most of you know that this is sort of my pet topic here at KCP, and so I was really amazed to see them offer Yubikeys, AXSionics’ Internet Passport and even SuisseID government-issued certificates as a means of strong authentication – making Clavid an early representative of the prospering “Authentication as a Service” market segment. Not prospering enough, I guess, as I did not see the Clavid guys buying fast cars and castles on Lake Geneva’s shores…
Anyway, the concept of letting us – the users/consumers/customers – decide who gets access to which detail of our lives and (digital) identities remains an unsolved issue. Be it the tedious task of filling out form after form to get your kid into day-care or getting new insurance for your car – you have to share information about yourself and your loved ones and wonder: do they REALLY need that info? And if so: why do they ask me the same questions over and over again?
Wouldn't it be nice if more of these form fields could be “auto-filled”, depending on your choice of what to disclose and what not? Wouldn't it be great to have one common place to securely store all the insurance information, account information and whatnot? Just like putting your valuables in a bank deposit box (or your high-security safe in your secret lair downstairs, depending on whether you are a super villain or not)? You could even “compartmentalize” your life into stuff belonging to work/career (like digital versions of all your certifications and endorsements), your personal leisure activities (like memberships in sports clubs, your fishing license or your Open Water Diver certificate), your kids' info (school district, headmaster contacts, the football team coach) – and the list continues.
I recently tried to gather my family's core identity data, such as passport and ID card numbers, SSN, healthcare ID, tax ID etc., and it took me a full Sunday. Last week I did it all over again, as I had misplaced the sheet of paper I used – pretty old school, don't you think?
But all personal stupidity aside: wouldn't it be great to use that “digital vault” full of your own personal data to actually ERASE all the personal details that are stored at the gazillions of companies and organizations you interact with day to day? Why must I deposit my CC info and full address with “your airline of choice” if I could use their services “pseudonymously” and only allow access to those details “on demand” while I actually book a flight? Currently, if I lose my CC or it expires, the internet economy burdens me with changing my CC info on each of the gazillion pages I do business with. Why?
I am looking forward to a (hopefully very near) future, where I can actually manage my data in one place and have those who need access to it authorized on a configurable basis. Sure, my employer should have continuous access to my bank account information! But if I am leaving – how can I make them erase that info on file today?
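To make the idea a bit more concrete, here is a minimal sketch of what such a “digital vault” could look like: attributes live in compartments, and a relying party receives a revocable grant instead of a copy of the data. All class and method names are my own invention for illustration, not any real product or standard.

```python
import secrets

class PersonalVault:
    """Toy sketch of a personal 'digital vault' with compartmentalized,
    revocable disclosure. Purely illustrative, not production code."""

    def __init__(self):
        self._data = {}    # compartment -> {field: value}
        self._grants = {}  # opaque token -> (compartment, allowed fields)

    def store(self, compartment, field, value):
        self._data.setdefault(compartment, {})[field] = value

    def grant(self, compartment, fields):
        # Hand the relying party an opaque token instead of the data itself.
        token = secrets.token_urlsafe(16)
        self._grants[token] = (compartment, set(fields))
        return token

    def disclose(self, token):
        # Called "on demand", e.g. while actually booking the flight.
        compartment, fields = self._grants[token]
        return {f: v for f, v in self._data[compartment].items() if f in fields}

    def revoke(self, token):
        # Leaving your employer? Revoke the grant - no stale copy to chase.
        self._grants.pop(token, None)
```

The point of the sketch is the direction of control: the data never leaves the vault wholesale, and revoking a grant is a single local operation instead of a letter-writing campaign to a gazillion companies.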
Look out for some new announcements and blogs on KCP on this – my colleagues will provide more info as it becomes “freely available”!
04.01.2012 by Sebastian Rohr
Well, the time between the years (today usually referring to the days after Christmas until New Year's Eve – but did you know these were historically the twelve days between December 24th and January 6th, which served to align the lunar and solar calendar years? But I am getting too far off-topic…) is used to reflect on the year passed. There are a few things and events that absolutely impressed me in 2011, which I would like to talk about a little!
First, there was the spring event, the European Identity Conference (EIC – www.id-conf.com), which had a great impact on me personally. I have never had so many interviews, briefings, talks and sessions to host in such a short amount of time. But instead of feeling exhausted and depleted when finally traveling home that Friday, I felt energized, motivated and inspired! So many interesting people to talk to, so many vibrant sessions and panel discussions to follow – and really delicious catering the whole time!
Second, the autumn event IT Security Area (www.it-sa.de) in Nuremberg. A trade fair by design, it was also packed with a decent conference framework programme, and the three official stages in the exhibition area featured a rather impressive set of security speakers such as Prof. Taher Elgamal, Martin Schallbruch of the Federal Ministry of the Interior or Horst Flätgen of the Federal Office for Information Security (BSI). Though spanning a much larger scope than EIC, Identity Management and Privacy Protecting Technologies were key topics discussed.
Finally, there was one vendor event that really impressed me a lot. Being a former CA, Microsoft and Siemens employee, I know what large corporations are able to pull off in terms of trade fairs, exhibitions and “in-house events”. But a Microsoft booth at CeBIT, a CA InExchange or similar events just did not compare. Ok, Microsoft TechEd, SAP SAPPHIRE and CA World are all a close call. But Oracle OpenWorld in San Francisco this year was by far the most exciting and entertaining event!
Let me give you a little impression of the breadth of topics that I (as an Identity, Privacy & Security Analyst) was confronted with:
- Big Data
- Cloud Services
- Database Management (doh!)
- Secure Programming Guides and Secure Development Programmes
- Hardware and the opportunities of full HW/MW/SW Stacks (see? I did not use “advantage”!)
- Bring Your Own Device (yeah, many Oracle people had personal “i”-devices with them!)
and many more!
Ok, the topics I can really give an insight on were the following:
It really looks like Oracle has assimilated the Sun hardware business – the racks could be seen all over the space at Oracle OpenWorld in San Francisco. Of more interest to me was how they would present their integration efforts in the IAM space, as they had also acquired a large amount of intellectual property and code around role-mining and attestation from Sun.
Sadly, they did not really make that a topic but continued to refer to their “suite”, which from my point of view still lacks some deep-end integration regarding OIA (Oracle Identity Analytics). At least it looks like the 12g releases will deliver on that. I met some happy customers, though, who had deployed this “component” of the suite, and they were all boasting how easy it was to set up and how they could impress their management with quick wins. Well, that was always inside the product's core, which I had the honor to work with during previous engagements. What I felt was missing a bit is a stress on the actual “power of the suite”: if you deploy OIA for analysis and re-certification (attestation), it is (or at least should be) a natural choice to have it co-deployed with OIM and get all the changes delivered automatically. There is integration, and Oracle has worked a lot on that behind the scenes. But there is still some way to go, for example by having one workflow system instead of two for OIA and OIM – again something that is said to become available with 12g.
Another point that needs to be addressed with the suite offering is a much more customer-centric approach to visualizing which component can help with which problem – a simple mapping would suffice! That would also help their field engineers and pre-sales staff, who sometimes appear a little uncertain about which component to use when and about the dependencies between components.
So, it is nice to hear about deeper integration of the Fusion Middleware component areas and how they work together to make our lives more enjoyable, but some clear communication about “what fits where” in the IAM arena alone would help them a lot. Once the components (and please do not rename them again) have gone through that “matchmaking” from a marketing/sales perspective, everyone could better draw the lines, delimit what functionality comes with which component, and see how to combine elements to get the expected functionality.
The last issue about selling an IAM suite I was curious about still remains unsolved: what to do if customers already have some components in place and do not want to migrate them? Selling a suite into a large organization may be like dumping a large black monolith into their IT. Having the components sharply delimited but at the same time tightly integrated is a key requirement for the vendor to successfully sell the suite. Keeping interfaces open and giving the customer the freedom to select a competitive component for – let us say – provisioning is key for customer success with their IT-landscape integration. While these goals seem contradictory at first, they become the same if you live up to your own pledge to support open and well-documented standards and interfaces. As soon as all components of a suite support the same set of standards and interfaces, they are clearly delimited (hopefully) and can be mixed and combined to better match the actual requirements customers have. The big black monolith referred to above then converts into a nice set of easy-to-connect Lego® bricks that enable customers to build their own suite. Given that the Oracle IAM suite in fact consists of many building blocks and that Oracle has a clear vision for (and is delivering on) a service-oriented approach to consuming IAM services – the Oracle Service Oriented Security – they are well positioned to tell a much stronger story here than they sometimes do.
The real Cloud – now available at Oracle (and only there!?)
According to the first entertaining minutes of Oracle CEO Larry Ellison's keynote at Oracle OpenWorld, Oracle is now the only vendor to offer a real Cloud – whatever that is supposed to mean. At least Hasan Rizvi elaborated a little more on the details of how to register, pick services, select the payment plan (!) and then get the service created and defined. I am so happy about that update, as Larry rather concentrated on bashing that certain other Cloud vendor, whose CEO keynote had been “postponed” the day before. At least in Germany, bashing the co-opetition is not considered good business conduct. At least not if you spend 90 minutes bragging about how inferior their services are to your own (which have not even materialized yet). Well, as mentioned, Hasan explained in more detail how the PaaS and IaaS offerings will be shaped and differentiated from the competition. A big focus will be on Java-based offerings, but my main points of interest were key capabilities like “Complete Isolation” of the different environments, SSO for the applications created, Centralized User Management with Delegated Administration for all of the above, as well as Identity Federation between internal and Cloud applications. That will be accompanied by “caging” resources and dedicated virtual machines per client, to keep customers more secluded and to avoid “leaking” of data between environments. Another nice point to add: Data Integration is supposed to make moving data to the Cloud and back from the Cloud to your internal apps easier. It is still unclear how that will actually work out, though.
I will return this year to see how the Suite approach was refined and how my (and some highly respected analyst folks) advice was used to push the capabilities of existing modules!
12.12.2011 by Sebastian Rohr
My colleague Jörg Resch just gave us a summary of the current status of the new EU privacy regulation that is “in the works” in Brussels. If only a portion of this becomes “EU Law” – meaning that it will not be a Directive which needs to be translated into national law but supersedes any existing national law – it will change the game in an instant. Not only would the “amusingly small” fines that can currently be imposed on, e.g., German companies for breaking privacy laws (standard maximum fine: 50,000 €) be bumped up to “significant” numbers, but the actual provider of a service could be held liable for not protecting the data of his customer (or his customer's customer, that is). Currently, if your company uses any kind of (IT) service and your customer data is disclosed through errors or omissions on behalf of the Service Provider, your company will still be sued and has to pay the fine, as you did not exercise proper Governance in your contract with the Service Provider (hence I've been promoting the need for good information security governance paragraphs in every outsourcing contract!). In other words: although your Service Provider failed to deliver secure services and neglected his responsibility to provide the high quality and security that you expected from a professional vendor, you are being held accountable for the improper action that led to the disclosure.
Looks like this is going to change! Or at least, the EU will try to change it… Beware of the lobbyists!
Sometimes fate has it that two corresponding subjects are discussed in parallel – as when I talked to my old friend Peter Schoo of the recently formed Fraunhofer AISEC in Munich-Garching. Just before I received Jörg's summary on the progress of the EU privacy law, I had discussed with Peter what has been happening regarding Privacy Protection and Anonymity in the market. Recently, my point of view on gathering “customer information” and storing it to create a “customer profile” has changed dramatically. Besides the fact that this is more or less in contradiction to Germany's data protection laws (referring to “Datensparsamkeit”, the principle of data minimization, here), marketing experts have always constructed some sort of “need” to justify this compilation. Especially the “REWE incident”, in which thousands of customer home addresses and other personal details were ripped from a marketing-driven exchange platform (through this site, kids could swap the stickers they harvested with each of Mom's shopping trips to REWE stores), made me feel that holding this data has become more of a liability/risk than a benefit/opportunity.
This is where Peter's newest creation comes into play – his team created a tool called “Prividor”, which stands for “Privacy Violation Detector”. It basically spiders a website and checks for any issues with data protection and privacy legislation that the site or portal may have. As some consumers are beginning to adopt a stricter handling of their personal information, those “concerned users” would definitely feel more comfortable browsing for “special information” on sites that respect their privacy. Especially government-owned sites or information portals that handle sensitive topics such as cancer, HIV infection or even “erectile dysfunction” would benefit greatly. Imagine users browsing for these things and then receiving even more “blue pill” advertisements than usual, or getting sponsored ads for cancer treatment on the next portal they visit – not what you fancy if you are really struck by that health condition!
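To give a feel for one thing such a scanner can check, here is a toy sketch that flags which third-party hosts a page pulls scripts, images or iframes from – the classic vector for tracking. The real Prividor is of course far more thorough; everything below (names, the HTML sample) is my own illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyScanner(HTMLParser):
    """Toy privacy check: collect third-party hosts a page embeds content from."""

    # Which attribute carries the external URL for each interesting tag.
    TAG_ATTR = {"script": "src", "img": "src", "iframe": "src", "link": "href"}

    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.third_party_hosts = set()

    def handle_starttag(self, tag, attrs):
        attr = self.TAG_ATTR.get(tag)
        if not attr:
            return
        url = dict(attrs).get(attr, "")
        host = urlparse(url).netloc
        if host and host != self.site_host:
            # Embedded from another host: the user's visit leaks to that host.
            self.third_party_hosts.add(host)

def scan(html, site_host):
    scanner = ThirdPartyScanner(site_host)
    scanner.feed(html)
    return sorted(scanner.third_party_hosts)
```

A health-ministry portal whose pages come back with an empty list from such a check is exactly the kind of site a “concerned user” could browse without feeding the ad networks.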
Well, people with extensive Facebook (or name your favorite social network here) usage will probably not even think about such things, but a growing number of “concerned users” will. Now take into account what the EU seems to be aiming at and – voilà – demand for a “privacy protecting web-design” of any kind will rise instantly.
As I said, sometimes fate “makes my day”.
Looking forward to your feedback, dear readers!
Oh, and here are the links, for the curious ones…
24.10.2011 by Sebastian Rohr
We have been discussing IRM, DRM, DLP and other acronyms back and forth for quite a while now, and I am sure there is a good bunch of solutions out there for those organizations that have policies and procedures in place to sufficiently plan, build and run such a tool. Thus, I was pretty much “meh” about any discussions revolving around the pros and cons of the approaches…
Well, our close friends sometimes surprise us with problems we never seem to have “seen” before. One of those friends runs a small system integrator / VAR company and approached me with a problem that is common among these service providers: the handling of RMAs…
Usually, if you have outsourcing agreements and service contracts, you would also have a number of SLAs that cover the use, transport, protection and security of data and mobile data storage devices such as flash-disks, thumb-drives or the very useful external hard drives, which are used to back-up full Virtual Servers if no SAN/NAS is available on-site.
Well, these SLAs cover exactly that: the STANDARD operating procedures and day-to-day handling of those devices. But what happens if one or more of the external hard drives becomes defective and is not accessible because the controller is broken? You just had a full back-up pushed onto that drive last Friday, and – during your standard tests of back-up media – you find the disk to be unresponsive due to controller failure. You KNOW that your client's full data-center, including Domain Controller, Exchange and ERP systems, is on that drive. You are unable to read the data, you cannot delete the drive either, and you cannot “open” the casing because doing so voids the warranty under which you would like to get the drive replaced by your vendor/distributor.
Actually, you would have to send in the defective drive as-is (with all your client data on it) and wait to have it replaced or repaired. If it is replaced – what happens to the “raw disks”? They could easily be put into a computer or hooked up to another controller and the data extracted. If it is repaired, the controller will be exchanged and at the very least the QA tests will reveal the sensitive nature of the data stored…
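The technical way out of this dilemma is to make sure plaintext never touches the external drive in the first place: encrypt the backup stream before writing, keep the key at the customer site, and a dead controller is just a dead controller – the platters hold only ciphertext. The sketch below only illustrates that workflow; the HMAC-SHA256 counter-mode keystream is a TOY for demonstration, and in practice you would use a vetted tool (dm-crypt, OpenSSL, an AES library) with proper key management.

```python
import hashlib
import hmac

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key (toy illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, backup: bytes) -> bytes:
    """XOR the backup with the keystream before it is written to the RMA-able drive."""
    ks = _keystream(key, len(backup))
    return bytes(a ^ b for a, b in zip(backup, ks))

# XOR with the same keystream inverts itself, so decryption is the same function.
decrypt = encrypt
```

With this workflow in the SLA, the RMA question largely evaporates: the distributor, the repair lab's QA team and whoever ends up with the “raw disks” only ever see ciphertext.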
According to the system integrator community, it is impossible to negotiate a special data-protection agreement with the distributors, as their margins are already too low to invest in legal advice over a set of 150 € products. Also, the clients are rather unwilling to sign a waiver which reduces or fully removes liability for any data breach from the SI. I would really LOVE to talk to some lawyers of the HD manufacturers and/or distributors about this topic, as I fear that a large number of these RMAs happen without any thought about data protection…
31.01.2011 by Sebastian Rohr
Back to the roots – Strong Authentication is my topic of the month. To be more precise: the combination of several methods of strong authentication, all managed through one central, versatile system, allowing both high-security solutions with a high cost per authentication and mass-market, easy-to-use methods for low- to medium-security settings. Versatile Authentication Services/Servers/Platforms are key to a low TCO and high usability across different user segments and use-cases.
I have already finished most of my market analysis and am currently compiling the report. If you feel the urge to let me know about your fresh product/solution in that segment, let me know! Just comment on this post or write to me!
Thanks for your input – see you all at EIC2011 to discuss the results!
23.10.2010 by Sebastian Rohr
The press release about HID acquiring ActivIdentity almost slipped my sensor network, despite the fact that I had the honour of having some close contact with top-level HID guys this week. I am totally positive about this acquisition, as HID now gets its hands on a really good Versatile Authentication Server (VAS) in ActivIdentity's 4TRESS product. This is what they need to really set a mark in the authentication industry, because their naviGO tool was a good starting point but it really lacks the quality and integration some of the other tools feature. HID is brand new to “software”, but they invested heavily in their own resources to come up with naviGO – thus it is a natural thing to seek an established brand or set of tools and accelerate the (obviously successful!) strategy of becoming one of THE vendors for VAS. I am confident that this is a good match, but as with all acquisitions there will be friction and loss of talent.
So, seeing things the other way round, it may be hard for the ActivIdentity guys to actually blend in with the HID guys. At least they had some hardware tokens of their own, and now they also have access to some really good contact and contactless card readers. Hopefully, the differences in style, attitude and go-to-market can be aligned, as the Identity & Access Management market is definitely something else than the PACS and RFID reader market space.
To sum things up, I think the two of them make a great match! Still, I ask myself whether the two worlds can really be merged by such a take-over. Everybody knows I am both a big fan and a big promoter of holistic security concepts and convergence as such – it would benefit the market to see this merger evolve into some truly converged VAS products!
04.08.2010 by Sebastian Rohr
The recently published document on protecting credit card data during processing and storage with tokenization technology has gathered quite a bit of response (see for yourself: http://usa.visa.com/download/merchants/tokenization_best_practices.pdf). As others like Mr. McMillon of RSA have said before (http://www.rsa.com/blog/blog_entry.aspx?id=1687), it is an overall good approach – and my very recent experience with CC data processing in outsourcing environments proves to me that solutions for this are in great demand. Besides the “nit-picking” (please excuse, we are totally on the same page here!) about calling encrypted CC data a “token” (which it is NOT…), there are some issues with the general approach taken by VISA. First, it is absolutely positive to see any progress and innovation around securing payment methods and payment processing, either at the PoS or online (and there are nice solutions for both environments readily available in the market, such as the nuBridges offering, for example). Second, it is advisable to contribute to standardization and commonly accepted methods – isn't it? Well, it looks like VISA – with all due respect for their effort to make this world a safer place! – has failed to get broad third-party support (such as funnelling this through the PCI DSS committees or having it openly reviewed by experts). It remains a mystery (at least to me) why VISA chose to spearhead this alone. The overall feedback received from experts around the world is a mixed bag of “well thought out, but has major weaknesses”.
Thus, it is definitely worth a look if you need to secure CC data in your systems and need guidance on how to define certain aspects. On the other hand, it is advisable to compare the VISA best practices with what the “other” stakeholders such as Mastercard, Diners, Amex and the like may add or edit. From my personal perspective I applaud the advances made by this project, but I clearly dislike the fact that VISA did this on their own, effectively putting an extra burden on banks, merchants and all others dealing with CC data to harmonize with deviating requirements that may be published by other companies. I sincerely hope that the payment card industry does not fall into a “deny-all” mode, but instead that a revised version with support from industry organizations such as the PCI DSS Council is made public soon. Until then, I recommend reading, understanding and cross-checking the VISA best practices for tokenization against the extensive feedback already available from industry experts around the globe. The time for protecting CC data and other PII is definitely NOW, and good tokenization can help to reduce the leakage of such information!
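To illustrate why encrypted CC data is NOT a token: a token is a random surrogate with no mathematical relation to the PAN, and the mapping exists only inside a hardened vault – there is nothing to “decrypt”. The sketch below is my own minimal illustration of that distinction (a production vault would add collision handling, access control and auditing, among many other things):

```python
import secrets

class TokenVault:
    """Toy tokenization vault: token -> PAN mapping exists only here."""

    def __init__(self):
        self._vault = {}  # token -> real PAN

    def tokenize(self, pan: str) -> str:
        # Random digits, keeping the last four so receipts still work.
        # Random means: no key, no algorithm relates the token back to the PAN.
        token = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 4))
        token += pan[-4:]
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the PAN.
        return self._vault[token]
```

Contrast this with encryption: an encrypted PAN can be recovered by anyone who obtains the key, wherever the ciphertext is stored. A stolen token, on the other hand, is worthless without a breach of the vault itself – which is exactly the property the merchant systems outside the vault benefit from.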
08.04.2010 by Sebastian Rohr
Just recently my Strong Authentication report was published, and now there is one vendor fewer in its scope: the French-American card and token giant Gemalto announced that it has acquired the niche player Todos.
Todos has some very interesting tokens, but I am pretty sure that Gemalto was just after Todos' IP around online-banking security. Unknown to most of the world, it is Todos (or now Gemalto) that owns the technology that many secure online banking solutions are based upon. Hopefully, Gemalto does not mess up those solutions too (remember the debit/credit card frenzy that broke loose when the Gemalto chips on many German cards failed to operate after 2010 “suddenly came around the corner” and the cards could not cope with the date input?). Being a victim of this bug myself, I strongly hope the product scope and expertise of Todos will remain with Gemalto – I have deep respect for the achievements of the Swedish experts!!!
24.02.2010 by Sebastian Rohr
Coming from a network security background, “IPsec 3DES VPNs” seemed to me to be the solution for secure data transfer between business partners for quite a long time. Over the years, with more experience, I naturally found out that this was not the solution for all the use-cases and scenarios these crazy folks called “customers” came up with. Nonetheless, when SSL VPNs became en vogue I hesitated to join the choir of supporters. While I fully understand and support the idea of a more flexible, more application- or user-centric approach due to the gain in usability, I still love my “old VPN client” when connecting to company resources.
During the last 13 months, two projects kept me busy that changed my personal perception of what one may need to be happy regarding secure access to resources and secure file transfer. One of them is largely related to “Cloud Computing” as such, and to using/processing company data which is not stored inside my brick-and-mortar, perimeter-secured, firewall-protected company server but somewhere on the “internet”. Making sure only the right person with the right credential accesses this data makes me want to use strong authentication, but few of the Cloud service providers offer such an additional layer of protection.
The other project was based on very Information Society 1.0 processes – the need to secure and protect the personal subscriber information for periodicals and daily newspapers that is exchanged between the publisher and the logistics service provider who manages the delivery of the above-mentioned print products – even if the subscriber is on vacation in Spain or recently moved to a new address. These transfers are conducted between separate systems distributed all over Europe. As most of these application systems are built individually, no real data standard is established. As the number of parties involved is high and the participants change frequently, classic VPNs are out of the question (and possibly “too expensive”). Thus, the need to protect the data transfer (yes, it is based on FTP!!!) is obvious. Well, have you ever tried to create a solution that acts both as a server AND a client and supports FTP, SFTP, FTPS and other cryptic siblings of the FTP protocol? No? Well, you should not!
Being a big fan of hardware-based, a.k.a. token-based, strong authentication mechanisms, vendors of non-hardware-based mechanisms usually have a hard time convincing me that their product briefings are worth paying attention to. MultiFactor's Garret Grajek was one of those CTOs whom I gave a hard time until I finally arranged an appointment for a briefing. What can I say? The approach of using soft certificates as the second factor for authentication, combined with out-of-band (a.k.a. SMS-based) messaging during the registration of a computer/session, did impress me – because it was so simple and straightforward! Especially for me, who uses multiple devices in parallel to access e.g. my mail, registering my personal computer at home or my client's laptop in the customer network to access Outlook Web Access really did the trick. Ok, the downside is that I still need to log in with my AD credentials – but this is something I criticized about Entrust's GRID authentication scheme too (which I love, because it is such a low-priced alternative to OTP tokens). Back to my project experience with outsourcing and “Cloud Services”: MultiFactor has now launched a nice extension which makes this approach available for use with services such as SalesForce.com and GoogleApps by leveraging federation technology. Now, I have to admit, this is something one can hardly achieve with one's own smartcard- or token-based authentication technology – especially not if one frequently changes the machine used. I guess if this approach can be tied into an authentication strategy and could possibly be supported by one of the Versatile Authentication Platform solutions, I could be a full supporter of these ominous “soft tokens”.
Still, this does not help directly with my friend's subscriber data, which needs to be updated daily. Fortunately, last Friday I had a briefing with nuBridges, a vendor of data protection tools that target both data at rest and data in motion. For the data-at-rest part, tokenization, scrambling and obfuscation allow data – especially sensitive information such as credit card numbers – to be altered and stored in such a way that unique identification is still possible but leaked data would essentially be worthless. I won't go into too much detail on this, but my experience with outsourcing and out-tasking applications that also handle payment transactions tells me that there is some need for it. I was far more interested in their secure data transfer solution, called nuBridges Exchange. Again, without going into too much technical detail, this solution provides a nice standard off-the-shelf product to securely handle multiple parties exchanging large quantities of files. Besides support for all varieties of secure file transfer protocols, the most important feature is the streaming capability of the solution. The files in transfer are not stored on the receiving end of the transfer connection but rather streamed onwards to a protected internal storage system. As the receiving server sits between two firewalls and the “inbound streaming” transmission through the internal firewall is initiated by the control server inside the secured area, no open ports need to be put into the internal firewall system. As the time for a first briefing is usually insufficient to go into much detail, I was unable to investigate the architecture and implementation further, but the management interface, the report dashboard and the availability of a self-service portal for the business partners all made a rather good overall impression.
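The streaming idea described above boils down to a very small pattern: the relay in the DMZ pumps chunks from the inbound connection straight onwards to the internal sink and never holds more than one chunk in memory, so a compromised relay has no complete file to steal. This is my own minimal illustration of the pattern, not nuBridges' implementation:

```python
def relay(source, sink, chunk_size=64 * 1024):
    """Pump bytes from a file-like source to a file-like sink chunk by chunk.

    The relay never buffers more than one chunk, so no complete file ever
    rests on the machine in the middle. Returns the number of bytes moved.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        sink.write(chunk)  # handed onwards immediately, never spooled to disk
        total += len(chunk)
    return total
```

In the nuBridges setup the `sink` side of this loop is the connection the internal control server opened outwards through the firewall, which is why no inbound port has to be punched into the internal firewall.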
I am looking forward to investigating these solutions further and will certainly take a closer look at their Exchange Network service as well – especially as protecting credit card data at the point of sale and between the PoS and central merchant systems seems to be attracting the attention of auditors lately.
What do you think about protecting data transfer and authentication/authorization strategies in a Cloud-environment? Let me know!