24.04.2013 by Craig Burton
CA Technologies acquires Layer 7, MuleSoft acquires Programmable Web, 3Scale gets funding
It is clear that the API Economy is kicking into gear in a big way. Last week, Intel announced its acquisition of Mashery; this week, CA Technologies announced its acquisition of Layer 7, MuleSoft announced its acquisition of ProgrammableWeb, and 3Scale closed a $4.2M round of funding.
Money is flooding into the API Economy as the importance of APIs only heightens. Expect this trend to continue.
The upside of this flurry of activity is the focus being given to the API Economy.
But here is my assessment.
CA’s acquisition of Layer 7 doesn’t necessarily bode well for Layer 7 or its customers. As a large vendor, CA will probably take longer to define and deliver on the roadmap than Layer 7 would have independently, but it might also put far more power behind that roadmap and its execution. Layer 7 needs an upgrade and needs to move to the cloud. CA has a clear cloud strategy it executes on – look at IAM and Service Management, where a large portion of the products is available as a cloud service – so there is strong potential for CA to put far more pressure behind the required move of Layer 7 to the cloud. Let’s see what happens there.
MuleSoft’s acquisition of ProgrammableWeb is a little weird. John Musser is an independent, well-spoken representative of the API Economy. MuleSoft has an agenda with its own platform. Does MuleSoft let Musser continue to be an independent spokesperson? Where does this lead? All answers unknown.
3Scale closed a $4.2M round of funding. It plans to use the funds to add more extensions to the product and grow its international distribution.
Lots of activity here. Curious to see what happens next.
However, one thing is clear: The API Economy is going mainstream.
23.04.2013 by Craig Burton
From partnership to acquisition
Let there be no confusion: Intel is a hardware company. It makes microchips. This is its core business. History shows that companies do best when they stick to their roots. There are exceptions.
At the same time, Intel has always dabbled in software at some level, mostly in products that support the chip architecture: compilers, development tools and debuggers. From time to time, however, Intel ventures into the software business with more serious intentions. Back in 1991, Intel acquired LAN Systems in an attempt to get more serious about the LAN utility business. That direction was later abandoned and Intel went back to its knitting as a chip vendor.
Recently, Intel has again become serious about being in the software business. Its most serious foray was the purchase of McAfee in 2010 to the tune of some $7.6 billion. A pretty serious commitment. We wrote recently about Intel’s intent to be a serious player in the Identity Management business with its composite platform, Expressway API Management. With that approach, Intel was clear that it had an “investment” in Mashery that would remain an arm’s-length relationship, best supporting the customer and allowing maximum flexibility for Mashery.
In general, I like that approach better than an acquisition. Acquisitions of little companies by big companies don’t always turn out for the best for anyone. Since then, it is clear that Intel management has shifted its view and thinks that outright ownership of Mashery is a better plan. While we agree that outright ownership can mean more control and management of direction, it can also mean the marginalization of an independent group that could possibly act more dynamically on its own. It is still too early to tell exactly how this will turn out for Intel and its customers; it will be important to watch how the organization is integrated into the company.
19.04.2013 by Craig Burton
When things go bad, they go really bad
At KuppingerCole we use Office365 extensively to manage our documents and keep track of document development and distribution.
On April 9, 2013, Microsoft released a normal-sized Patch Tuesday update to Windows and Office products. The only thing is, this time the update completely broke the functionality of Office 365 and Office 2013. Trying to open a document stored in SharePoint would result in a recursive dialogue box asking you to authenticate to the SharePoint server. The same thing would happen when trying to upload a document. Excel and PowerPoint documents had the same problem.
Going to the Office365 forum resulted in a bevy of customers complaining about the problem. A Microsoft tech support person was offering possible solutions, all of which were just time wasters and solved nothing.
“First, please run desktop setup by following Set up your desktop for Office 365.
If the issue persists, please remove saved login credentials from the Windows credential manager and then sign into the MS account.”
Finally, two days later a customer posted a solution.
“KB2768349 is definitely the culprit. I uninstalled this on Windows RT and login worked again across all Office 2013 RT apps. Reinstalling broke it. Uninstalling again fixed it.
Replicated on my Windows 8 desktop with Office 2013.
For the time being I have hidden KB2768349 from Windows Update until this is fixed.”
As soon as I deleted the KB2768349 update the problem went away. I also learned what “hiding” an update entails.
For those of you dying to know, here is how you fix this thing.
Control Panel > Windows Update > View update history > Installed Updates
Scroll down through the Office 2013 updates until you find KB2768349. Select it and then uninstall.
Of course, once you uninstall an update, it’s going to show back up again and try to reinstall. The way you prevent this is to “hide” the update so it doesn’t keep showing up. To hide an update, open Windows Update, right-click the update you want to hide, and select “Hide update.” There you go.
So for two days the normal operation of Office365 was frustratingly broken. Now this was not just for me and my colleagues, but for everyone on the planet that used Office365 and installed these updates. At the same time, the fix applies to everyone on the planet using Office365 as well. In other words, critical apps in the cloud that go bad, go bad hard. They also heal big. Part of the deal.
I was surprised that I was the only one tweeting and complaining about it. I didn’t see a single article or public comment on this major screw-up. The only place I saw any complaining was on the Office365 forum. So glad that was happening.
17.04.2013 by Craig Burton
Identity Management is a universal problem
When I pay my electric bill I usually just call the power company and give them my credit card. This month I decided that I should go set up auto payments on the web site and be done with it. So I opened the power company web site and attempted to log in. Clearly the site recognized me – the login name I usually use was accepted – but I just could not remember my password. I tried all of the normal passwords I use and none of them worked.
So I attempted to retrieve my password. The site gave me the option of having a password reset sent to my email address or answering secret questions. I opted to have it sent to my email address. I waited. Nothing showed up in my inbox. I looked in the spam folder; still nothing. I went back to the web site and this time opted for being asked the secret question… “What is your favorite color?” Oh man, I don’t know. Depends on my mood and what day it is. I don’t remember what I put in there for my favorite color. OK, let’s try “Blue.” Good, that worked. Wow. I am in. Hey. This isn’t my account? WTF?
Now I know there are two other Craig Burtons living in Utah. Apparently I had just accessed the electricity billing account of one of them by guessing both the user name and the answer to the secret question. And that answer was “blue”?
Off the top of my head, I would say the electric company has a severe security hole.
Of course I didn’t do anything to this account. I could see that another request to change the name and password had just been sent to his email address. I hope he followed through on it.
This was an ugly incident that could have been much uglier if I was malicious.
Here is my point: a uniform cloud-based Identity Management system could be used to prevent this kind of thing. As it stands, every single web site has its own set of code used to prevent inappropriate access – a scenario bound to create the blatant hole I ran into.
Of course, the other side of the coin is that if the cloud-based identity management system had a hole in it, everybody would have the hole. Then again, the fix would fix everybody. Trade-offs, but I still think cloud-based Identity Management as a Service is where we are headed.
18.03.2013 by Craig Burton
With the rapidly emerging cloud-mobile-social Troika coupled with the API Economy, there are many questions about how to design systems that allow application access to internal information and resources via APIs without compromising the integrity of enterprise assets. And on the other hand, how do we prevent personal information from propagating inappropriately as personal data stores are processed and accessed? Indeed, I have read many articles lately that predict utter catastrophe from the inevitable smartphone and tablet application rush that leverages the burgeoning API Economy.
In recent posts, I have posited that one approach to solving the problem is by using an IdMaaS design for authentication and authorization.
Another proposed approach—that keeps coming up—is a system construct that is referred to as the “Façade Proxy.”
A place to start to understand the nature of Facades is in an article by Bruno Pedro entitled “Using Facades to Decouple API Integrations.”
In this article Bruno explains:
A Façade is an object that provides simple access to complex – or external – functionality. It might be used to group together several methods into a single one, to abstract a very complex method into several simple calls or, more generically, to decouple two pieces of code where there’s a strong dependency of one over the other.
Figure 1 – Facade Pattern Design (Source: Cloudwork)
What happens when you develop API calls inside your code and, suddenly, the API is upgraded and some of its methods or parameters change? You’ll have to change your application code to handle those changes. Also, by changing your internal application code, you might have to change the way some of your objects behave. It is easy to overlook every instance and can require you to double-check multiple lines of code.
There’s a better way to keep API calls up-to-date. By writing a Façade with the single responsibility of interacting with the external Web service, you can defend your code from external changes. Now, whenever the API changes, all you have to do is update your Façade. Your internal application code will remain untouched.
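Bruno’s point can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (the client class and method names are invented, not from any real library): the application only ever calls the façade, so a change in the external service touches exactly one class.

```python
# A minimal sketch of the Façade idea: the application talks only to
# WeatherFacade. If the external client's method names or parameters change,
# only the façade is updated; application code stays untouched.

class ExternalWeatherClient:
    """Stand-in for a third-party API client we don't control."""
    def fetch_observation(self, city_code, unit_system):
        # Imagine a real HTTP call here; canned data keeps the sketch runnable.
        return {"temp": 21.5, "unit": unit_system, "station": city_code}

class WeatherFacade:
    """Single point of contact with the external service."""
    def __init__(self, client=None):
        self._client = client or ExternalWeatherClient()

    def current_temperature(self, city):
        # If the vendor renames fetch_observation or changes its signature,
        # this is the only method we touch.
        data = self._client.fetch_observation(city_code=city, unit_system="metric")
        return data["temp"]

facade = WeatherFacade()
print(facade.current_temperature("BER"))
```

Swapping in a different weather vendor, or a new version of the same API, then means writing a new client class and leaving the rest of the application alone.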
Shedding even more light on how a Façade Proxy is designed, and how it can be used to address yet another problem, is a blog post from Kin Lane. Kin is an API Evangelist extraordinaire and I learn a lot from his writings. He recently wrote in a post entitled “An API that Scrubs Personally Identifiable Information from Other APIs”:
I had a conversation with one UC Berkeley analyst about a problem that isn’t just unique to a university, but they are working on an innovative solution for.
UCB Developers are creating Web Services that provide access to sensitive data (e.g. grades, transcripts, current enrollments) but only trusted applications are typically allowed to access these Web Services to prevent misuse of the sensitive data. Expanding access to these services, while preserving the confidentiality of the data, could provide student and third party developers with opportunities to create new applications that provide UCB students with enhanced services.
Wrapping untrusted applications in a “Proxied Façade Service” framework that passes anonymous tickets through the “untrusted” application to underlying services that can independently extract the necessary personal information provides a secure way of allowing an application to retrieve a Web User’s Business data (e.g. their current course enrollments) WITHOUT exposing any identifying information about the user to the untrusted application.
I find their problem and solution fascinating, I also think it is something that could have huge potential. When data leaves any school, healthcare provider, financial services or government office, the presence of sensitive data is always a concern. More data will be leaving these trusted systems, for use in not just apps, but also for analysis and visualizations, and the need to scrub personally identifiable information will only grow.
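The scrubbing step at the heart of the UCB idea can be sketched roughly as follows. This is a hypothetical illustration, not their implementation: the field names and the ticketing scheme are invented, but the shape is the same – identifying fields are stripped before data reaches the untrusted application, and the user is represented only by an opaque ticket.

```python
import hashlib

# Hypothetical sketch of a "Proxied Façade Service" scrub: strip PII fields
# and replace the real user identifier with an opaque, anonymous ticket that
# only the trusted side can map back to a person.

PII_FIELDS = {"name", "email", "student_id", "ssn"}

def anonymous_ticket(user_id, secret="server-side-secret"):
    # Opaque handle the untrusted app can pass back to the trusted service.
    return hashlib.sha256((secret + user_id).encode()).hexdigest()[:16]

def scrub(record):
    clean = {k: v for k, v in record.items() if k not in PII_FIELDS}
    clean["ticket"] = anonymous_ticket(record["student_id"])
    return clean

record = {"student_id": "12345", "name": "A. Student",
          "email": "a@berkeley.edu", "enrollments": ["CS61A", "MATH54"]}
print(scrub(record))
```

The untrusted app sees the business data (enrollments) and a ticket, but never the name, email or student ID.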
Finally, Intel recently announced its Expressway API Manager product suite. EAM is a new category of service that Intel is calling a “Composite API Platform.” It is referred to as such because the platform is a composite: a premise-based gateway that lets organizations create and secure APIs that can be externalized for secure access, plus a cloud-based API management service from Mashery designed to help organizations expose, monetize and manage APIs for developers. In its design, Intel has created a RESTful Façade API that exposes an organization’s internal information and resources to developers. It is very similar to the design approach outlined by Kin. This looks to be an elegant use of the Façade pattern to efficiently manage authorization and authentication of mobile apps to information that needs to remain secure.
Figure 2 – EAM Application Life Cycle (Source: Intel)
I am learning a lot about the possible API designs—like the Façade Proxy—that can be useful constructs for organizations to successfully participate in the API economy and not give up the farm.
21.02.2013 by Craig Burton
Making an API is hard. Deciding how to make one is a tough question, too. A small company out of England has figured out how to let anyone make an API with just:
- A Spreadsheet
- A Datownia SaaS account
One of the activities I practice to keep up with what is happening in the world of APIs is to subscribe to the ProgrammableWeb’s newsletter. Every week the newsletter contains the latest APIs that have been added to the rapidly increasing list. While I seldom can get through the whole list, I inevitably find one or two new APIs that are really interesting.
Recently I ran into one that has an incredibly simple and effective method of creating an API out of a spreadsheet.
The Company is Datownia.com
I now have an API with a developer portal that is driven by data in a spread sheet.
I can distribute developer keys to any developer I choose and then that developer can access the data and integrate it into any app.
Further, any change I make to the spreadsheet gets versioned and propagated to the API with just a click. To propagate the data, all I do is modify the spreadsheet and drop it into the linked Dropbox folder.
Here is what my spreadsheet looks like.
Here is what the JSON looks like when you make a RESTful call to the API location created for me by Datownia.
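The original screenshots are not reproduced here, so as a purely hypothetical illustration, a spreadsheet-backed API along these lines might return JSON shaped something like this (column names and values invented):

```python
import json

# Hypothetical sketch: each spreadsheet row becomes one JSON object, and the
# service wraps the rows in an envelope with a version number so consumers
# can tell which upload of the spreadsheet they are reading.

rows = [
    {"product": "Widget A", "price": 9.99, "in_stock": True},
    {"product": "Widget B", "price": 14.50, "in_stock": False},
]

# A RESTful GET against the generated API would return something like:
print(json.dumps({"version": 3, "items": rows}, indent=2))
```

The point is the mapping, not the exact schema: columns become keys, rows become objects, and a new spreadsheet drop bumps the version.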
I have been talking a lot about companies that manage already existing APIs. But what about organizations that need to create APIs?
A few weeks ago, I received an email from the CEO of Datownia wanting to give me a small gift to chat with him about what I was doing with their technology.
Of course as an analyst I can’t accept any gifts, but I had a great conversation with William Lovegrove about the technology and where the idea came from.
From one-offs to a SaaS
Basically, William’s little consulting firm was busy building and evangelizing APIs to organizations. When a company was confronted with making an API, progress would often screech to a halt, or at least be diverted while things were sorted out. Often IT departments simply could not deal with making an API for anything. Other times the firm would be engaged to create a one-off API for a company.
Complicated, expensive and not very efficient.
Datownia then came up with the idea of building a service in the cloud that automates the process of building an API.
I think this is brilliant.
If you need an API, or just want to play with a prototype, you should take a look at how simple this is.
Thanks William Lovegrove and crew.
25.01.2013 by Craig Burton
The three biggest trends impacting computing today are what I call the Computing Troika. Cloud Computing, Mobile Computing and Social Computing.
There is a fourth trend that is on par with each of the Troika movements. The API Economy.
Finally there is the question of the role of standards in these trends.
First, here is my definition of Cloud Computing—and its opposite—Non-cloud Computing.
Cloud Computing involves offering network computing services with the following three characteristics:
- IT Virtualization
- Multi-tenancy
- Service re-usability
IT Virtualization—Network services, including management and support, that are geographically independent. That is not to say that services cannot be on-premise; it just means that it doesn’t matter where they are.
Multi-tenancy—Network services that are offered to more than one tenant at a time.
Service re-usability—Network services that can be used and built upon for all tenants over and over.
All three are important. The one that needs explaining and is not so obvious is the Service Re-usability feature. AD FS (Active Directory Federation Services) integrated into WAAD (Windows Azure Active Directory) is a good example. Because it is virtual and multi-tenant, a single SAML instrumentation to WAAD gives permissioned SSO and integration to EVERY customer connected to WAAD by default. This makes it highly leverageable and “reusable.” Further, all of the services to all tenants of WAAD have the same APIs, the same console UI, indeed, the same infrastructure from top to bottom. This lets IT departments be much more efficient and competitive.
But here is where the new rubber meets the road. WAAD is specifically designed to give the customer real freedom of choice. This is done by not trying to keep the customer captive with either architecture design or terms of service.
The architecture design moves away from keeping the customer captive in a silo, with two main features.
- Standards support
- APIs for everything
Standards support is the traditional bailiwick of interoperability. Interoperability is the key feature of services that are vendor independent. But standards are not a panacea. Standards move slowly. Further, it is a myth that standards compliance guarantees interoperability. One vendor’s standard floor is another vendor’s standard ceiling. To remain competitive, vendors tend to tweak the standards game—sometimes in excess—to maintain an advantage.
The new equalizer—for both the customer and the vendor—is the API Economy. By providing open, simple API access to everything, a vendor can still differentiate and yet offer real freedom of choice in its services. With a complete API infrastructure, services are no longer silos. Any customer or competitor can duplicate or extend the apps that use the services (such as an admin console, user portal or developer portal) without repercussion.
“Non-cloud” then becomes any architecture design that does not include all three of these features. Note that this puts even more importance on the API Economy. The IT computing silo prison can only be broken through an active API Economy. The key to successful customer-centric product design is giving the customer Freedom of Choice. Freedom of Choice is not freedom granted by a captor. Freedom of Choice must be vendor independent. Independence can only be gained through the API Economy coupled with the traditional standards process.
There is more than one type of standard.
The three main types are:
- de Facto
- de Jure
- de Rigueur
De facto—Latin for “in fact”; a standard by default. TCP/IP is actually a de facto standard. It was declared by governments that the standard network protocol would be the de jure OSI standard. As we all know, OSI never happened. TCP/IP is the de facto standard of the internet.
De jure—Latin for “by law”; in practice, a standard by committee. HTTP is a de jure standard from the W3C.
De rigueur—French for a standard set by current fashion, though it goes far beyond “fashion.” Both de facto and de jure standards are very slow moving. In fact, a de jure standard is—except for governments—obsolete and certainly out of fashion by the time the committee ratifies it.
I bring up the distinction between these three kinds of standards because I expect to see a rapid proliferation of de rigueur standards, built on top of both de facto and de jure standards, that are highly usable and can be referred to and used as “standards” providing interoperability without either a laborious and expensive de jure process or the expectation of an accidental de facto crown.
For example, the “Graph API” designs and methods we see in the Facebook and WAAD APIs are going to become standards independently of any committee. Of course, the thought of this kind of talk scares a lot of people to death because of the kind of crazy behavior we see from vendors like Twitter and what it has done to its developers and its API.
But it is my opinion that when vendors act in such irresponsible ways, they do so at their own peril. I believe that in the long term we can successfully lean on rational thought and behavior to support a strong three-standard ecosystem that works.
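Part of the appeal of the Graph API convention is how little there is to it: objects are addressed by path and responses are shaped by query parameters. A hedged sketch of that convention (no network call is made, just URL construction in the style Facebook’s Graph API uses; the helper function is my own):

```python
from urllib.parse import urlencode

# Graph-style request: the object lives at a path, and the caller asks for
# exactly the fields it wants via a query parameter. This convention, not any
# committee, is what makes such APIs easy to adopt.

def graph_url(base, object_id, fields):
    return f"{base}/{object_id}?{urlencode({'fields': ','.join(fields)})}"

url = graph_url("https://graph.facebook.com", "me", ["id", "name"])
print(url)  # https://graph.facebook.com/me?fields=id%2Cname
```

A de rigueur standard in miniature: nothing here is ratified anywhere, yet any developer who has used one graph-style API can guess how the next one works.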
28.11.2012 by Craig Burton
Starting at EIC 2012, I have been talking and presenting a lot about The API Economy. The API Economy has become a strategic topic for organizations. As one can expect with a hot topic, there are many opinions and views on the matter, and therefore many comments, blog posts and articles written about The API Economy.
Needless to say, it is tough to keep track of everything being said or to track any given thread. I should start off by saying that the questions asked in the blog post discussed below are appropriate and need to be answered.
The DataBanker thread
An interesting thread that I have been following for a while has inspired me to make a few comments about exactly what I mean by an API and to add additional data about the actual API numbers.
The people over at DataBanker published a piece in September entitled “Personal Identity in the Cloud. What’s a Programmer to Do?”
The author goes on to cite the numbers I have used in several presentations to derive the actual number of APIs we will be dealing with over the next five years. First, he questions the accuracy of the numbers and their implications.
“I have to admit, the statistics from the Apple announcement, especially when combined with the view from Cisco, definitely make one stop and think. But Craig Burton’s blog post has apocalyptic overtones that I don’t think are accurate.”
Next, he asks what I actually mean when referring to an API.
“When Craig Burton refers to “20+ billion APIs all needing distinct identities”, what I believe he is actually referring to is interconnections and not discrete APIs.”
And finally the author states that the Identity Ecosystem being established by NSTIC will be used to address the problems brought on by The API Economy.
“Managing identity – entity or personal – within the Cloud certainly has some unique challenges. Fortunately, there are substantial communities such as the NSTIC Identity Ecosystem and projectVRM that are focused on defining standards for creating, validating, managing, and transacting trusted identities as well as looking at the broader issue of how individuals can control and assert their identity and preferences when engaging products, services, and vendors within this expanding internet of things. Multiple solutions will likely be proposed, developed, will co-exist, and eventually consolidate based on the collective wisdom and adoption of the cloud community. That community – or ecosystem – is really the key.”
So let me address each of these in turn.
The Apple and Cisco numbers and their apocalyptic overtones
First off, let me say that the numbers I quote from the iPhone 5 announcement — while a little overwhelming — are very conservative. Mary Meeker — partner at Kleiner Perkins Caufield & Byers — recently gave a talk about the growth of the device market. In that talk, she pointed out that Android phones are ramping up six times faster than the iPhone.
“By the end of 2013, Meeker expects there to be 160 million Android devices, 100 million Windows devices, and 80 million iOS devices shipped per quarter.”
If you believe the first axiom of The API Economy — Everything and Everyone will be API Enabled — the significance of this additional research on the number of devices being shipped is non-trivial. The current methods being used to provision and manage the identities associated with these devices are broken and cannot scale to address the issue. Call that apocalyptic if you want, but ignoring the facts does not make them go away.
Interconnections not APIs
As I pointed out earlier, DataBanker supposes that my 20+ billion APIs figure refers to “interconnections and not discrete APIs.”
I am actually referring to a conservative number of discrete APIs. Here is how APIs work. Every API must have a unique identity. Not necessarily unique functionality, but a unique ID.
But DataBanker did find the missing information in my numbers: I didn’t include relationships and interconnections. I left them out of the equation because I wanted to keep things somewhat simple. In fact, each interconnection and relationship also needs an API and a unique ID. Thus the number of actual APIs we are looking at is 3 to 5 times bigger than the numbers I outlined originally.
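The back-of-the-envelope arithmetic is simple enough to write down (the figures are the illustrative ones from the post, not measurements):

```python
# Illustrative arithmetic only: start from the "20+ billion APIs" figure and
# apply the observation that interconnections and relationships each need an
# API with a unique ID, multiplying the total by roughly 3 to 5.

base_apis = 20e9             # 20+ billion discrete APIs
low_factor, high_factor = 3, 5

low_total = base_apis * low_factor    # interconnections included, low end
high_total = base_apis * high_factor  # interconnections included, high end
print(f"{low_total / 1e9:.0f} to {high_total / 1e9:.0f} billion APIs")
```

That is 60 to 100 billion unique identities to provision and manage, which is the scale argument behind the call for an automatable identity ecosystem below.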
NSTIC Identity Ecosystem will address the problem — NOT
Here is where DataBanker and I start to agree — at least sort of.
It will take a community to address The API Economy’s explosion in identity management requirements. The NSTIC and ProjectVRM communities can help, but neither of them in its current state addresses the matter. For more information about what NSTIC is in this context, read this blog post.
The Ecosystem required to address billions of Identities and APIs is one that can be automated. Programmed. In order to address a programmable web, we need a programmable ecosystem to accompany it.
We are calling this ecosystem Identity Management as a Service.
I continue to stand by my numbers and projections of the implications being brought on by the API Economy. I see that in the near future, everything and everyone will be API enabled.
I also see a great number of people and organizations that do understand this issue and are moving forward with intention to address it and to succeed with the API Economy.
17.10.2012 by Craig Burton
The Intersection of Policies, Standards & Best Practices for Robust Public Sector Cloud Deployments
Last week I was invited to attend the 2012 International OASIS Cloud Symposium.
I was very impressed. The attendance was not large—in fact—the organizers limited the number of attendees to 125 people. I was not able to attend the first day, but the second day was lively with many interesting presentations and discussions.
I won’t go over the complete agenda; if you want it, it can be found in PDF format here.
Overall I would say every presentation given was worth listening to and the information was both valuable and informative. Not all of the presentations have been posted yet but a good number of them—including mine—can be found at this location.
I wanted to highlight a few of the presentations that were especially interesting. Again, I think all of them are worth looking at, but here are some highlights.
Privacy by Design
The day started out with the Information and Privacy Commissioner of Ontario, Canada—Dr. Ann Cavoukian—giving a presentation via video to the group on Privacy by Design. Her message was that she and Dr. Dawn Jutla—more about Dr. Jutla in a second—are co-chairing a technical committee on Privacy by Design for software engineers.
“It’s all about developing code samples and documentation for software engineers and coders to embed privacy by design into technology. We are going to drill down into the “how to” in our technical committee.”
Following the video by Dr. Cavoukian, Dr. Dawn Jutla gave a presentation about Privacy by Design (PbD).
Now I have heard of Dr. Cavoukian and the PbD movement. But I had never been exposed to any details. The details were amazing and I like the 7 Foundational Principles.
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality—Positive-Sum, not Zero-Sum
5. End-to-End Security—Full Lifecycle Protection
6. Visibility and Transparency—Keep it Open
7. Respect for User Privacy—Keep it User-centric
These are sound principles that make a lot of sense. So much so that I invited Dr. Jutla to attend the Internet Identity Workshop (IIW) and to jointly present with me a discussion about Privacy and Identity in an API Economy.
Dr. Jutla agreed and we will lead the discussion on both Tuesday and Wednesday of next week (October 23, 24) at IIW.
If you look at the agenda, the rest of the speakers presenting on privacy were stellar. I learned a lot.
I strongly recommend looking over the agenda and reviewing the presentations that interest you. For most organizations, this should be every plenary and every discussion group.
I was also impressed with OASIS’s ability and willingness to invite seemingly competitive groups, like iso.org, ANSI, and Kantara. This is the way a standards body should work when it has the best interest of the industry and the objective of open standardization at heart.
Kudos to Laurent Liscia and the entire OASIS organization for the execution of a great event.
24.09.2012 by Craig Burton
National Institute of Standards and Technology awards $9M to support trusted identity initiative
On September 20, 2012, the National Institute of Standards and Technology (NIST) announced more than $9 million in grant awards in support of the National Strategy for Trusted Identities in Cyberspace (NSTIC).
The grants were awarded to five consortiums, all of them big, and all of them representing different views and technologies with a strong focus on identity, security, and trust.
While many identity and security professionals are familiar with the Obama administration’s NSTIC program, many international professionals are not. In order to address all of KuppingerCole’s constituents, some background information is useful.
The impetus for the NSTIC policy move by the Obama Administration is part of the Cyberspace Policy Review published in June 2009. The administration appointed Howard Schmidt to a new Cyber Security Coordinator position. Schmidt is a well-known security expert, experienced in international security policies and technologies.
On Tuesday, December 22, 2009, Schmidt was named as the United States’ top computer security advisor to President Barack Obama. Previously, Schmidt served as a cyber-adviser in President George W. Bush’s White House and has served as chief security strategist for the US CERT Partners Program for the National Cyber Security Division through Carnegie Mellon University, in support of the Department of Homeland Security. He has served as vice president and chief information security officer and chief security strategist for eBay.
Prior to joining the Obama Administration, Schmidt served as President of the Information Security Forum and President and CEO of R & H Security Consulting LLC, which he founded in May 2005. He was also the international president of the Information Systems Security Association and a board member of the Finnish security company Codenomicon, the American security company Fortify Software, and the International Information Systems Security Certification Consortium, commonly known as (ISC)². In October 2008 he was named one of the 50 most influential people in business IT by readers and editors of Baseline Magazine.
Under Schmidt’s direction and managed by NIST, the first draft of NSTIC was published in June 2010. The draft received much criticism for its lack of privacy protection for individuals and for the size of the role played by the government. A final draft was rewritten and published in May 2011; in it, the role of the government was reduced and the privacy issues were addressed.
The stated objectives of the NSTIC initiative are:
NSTIC is a White House initiative to work collaboratively with the private sector, advocacy groups and public-sector agencies. The selected pilot proposals advance the NSTIC vision that individuals and organizations adopt secure, efficient, easy-to-use, and interoperable identity credentials to access online services in a way that promotes confidence, privacy, choice and innovation.
“Increasing confidence in online transactions fosters innovation and economic growth,” said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher. “These investments in the development of identity solutions will help protect our citizens from identity theft and other types of fraud, while helping our businesses, especially small businesses, reduce their costs.”
NSTIC envisions an “Identity Ecosystem” in which technologies, policies and consensus-based standards support greater trust and security when individuals, businesses and other organizations conduct sensitive transactions online.
The pilots span multiple sectors, including health care, online media, retail, banking, higher education, and state and local government and will test and demonstrate new solutions, models or frameworks that do not exist in the marketplace today.
As expected, NIST picked big consortiums with big ideas for identity and trust across a broad spectrum of technologies and market segments. Here are the basics of its choices:
“These five pilots take the vision and principles embodied in the NSTIC and translate them directly into solutions that will be deployed into the marketplace,” said Jeremy Grant, senior executive advisor for identity management and head of the NSTIC National Program Office, which is led by NIST. “By clearly aligning with core NSTIC guiding principles and directly addressing known barriers to the adoption of the Identity Ecosystem, the pilot projects will both promote innovation in online identity management and inform the important work of the Identity Ecosystem Steering Group.”
The grantees of the pilot awards are:
The American Association of Motor Vehicle Administrators (AAMVA) (Va.): $1,621,803
AAMVA will lead a consortium of private industry and government partners to implement and pilot the Cross Sector Digital Identity Initiative (CSDII). The goal of this initiative is to produce a secure online identity ecosystem that will lead to safer transactions by enhancing privacy and reducing the risk of fraud in online commerce. In addition to AAMVA, the CSDII pilot participants include the Commonwealth of Virginia Department of Motor Vehicles, Biometric Signature ID, CA Technologies, Microsoft and AT&T.
Criterion Systems (Va.): $1,977,732
The Criterion pilot will allow consumers to selectively share shopping and other preferences and information, both to reduce fraud and to enhance the user experience. It will enable convenient, secure and privacy-enhancing online transactions for consumers, including access to Web services from leading identity service providers; seller login to online auction services; access to financial services at Broadridge; improved supply chain management at General Electric; and first-response management at various government agencies and health care service providers. The Criterion team includes ID/DataWeb, AOL Corp., LexisNexis® Risk Solutions, Experian, Ping Identity Corp., CA Technologies, PacificEast, Wave Systems Corp., Internet2 Consortium/In-Common Federation, and Fixmo Inc.
Daon, Inc. (Va.): $1,821,520
The Daon pilot will demonstrate how senior citizens and all consumers can benefit from a digitally connected, consumer friendly Identity Ecosystem that enables consistent, trusted interactions with multiple parties online that will reduce fraud and enhance privacy. The pilot will employ user-friendly identity solutions that leverage smart mobile devices (smartphones/tablets) to maximize consumer choice and usability. Pilot team members include AARP, PayPal, Purdue University, and the American Association of Airport Executives.
Resilient Network Systems, Inc. (Calif.): $1,999,371
The Resilient pilot seeks to demonstrate that sensitive health and education transactions on the Internet can earn patient and parent trust by using a Trust Network built around privacy-enhancing encryption technology to provide secure, multifactor, on-demand identity proofing and authentication across multiple sectors. Resilient will partner with the American Medical Association, Aetna, the American College of Cardiology, ActiveHealth Management, Medicity, LexisNexis, NaviNet, the San Diego Beacon eHealth Community, Gorge Health Connect, the Kantara Initiative, and the National eHealth Collaborative.
In the education sector, Resilient will demonstrate secure Family Educational Rights and Privacy Act (FERPA) and Children’s Online Privacy Protection Act (COPPA)-compliant access to online learning for children. Resilient will partner with the National Laboratory for Education Transformation, LexisNexis, Neustar, Knowledge Factor, Authentify Inc., Riverside Unified School District, Santa Cruz County Office of Education, and the Kantara Initiative to provide secure, but privacy-enhancing verification of children, parents, teachers and staff, as well as verification of parent-child relationships.
University Corporation for Advanced Internet Development (UCAID) (Mich.): $1,840,263
UCAID, known publicly as Internet2, intends to build a consistent and robust privacy infrastructure through common attributes; user-effective privacy managers; anonymous credentials; and Internet2’s InCommon Identity Federation service; and to encourage the use of multifactor authentication and other technologies. Internet2’s partners include the Carnegie Mellon and Brown University computer science departments, the University of Texas, the Massachusetts Institute of Technology, and the University of Utah. The intent is for the research and education community to create tools that help individuals preserve privacy and a scalable privacy infrastructure that can serve a broader community and add value to the nation’s identity ecosystem.
High Level Analysis
In terms of government initiatives, NSTIC has been moving at lightning speed. Jeremy Grant has been a proactive advocate of the initiative and is an articulate and capable leader. That shows in the choices of these consortiums and their constituents.
At the same time, 9 million dollars spread across five initiatives, each with many mouths to feed, does not go very far and can be used up very quickly. It will be interesting to see how far each gets over the next twelve months. I chose twelve months because I can’t see how the money awarded to each group will last much longer than that.
Each group will need to put a plan together and execute in that time frame if they are to survive.
In the near term, we will take a closer look at each initiative: what their respective architectures look like and what their specific objectives are within the identity ecosystem outlined by NIST.
Of course, I will be paying special attention to what each consortium has planned as an API Economy strategy. Each will need to have a solid API design that gives all of the other groups API access to all of the services through both the Web Services legacy (SOAP) and the emerging API Economy imperative (RESTful).
If each group does not have a solid SOAP/RESTful API strategy, they simply will not succeed—either individually or as a whole.
I know it sounds strange coming from me that an organization should continue embracing the SOAP legacy, but there are just too many government and non-profit organizations that cannot afford to jump to the real world quickly and must continue carrying the burden of the past. So it is sometimes.
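To make the dual SOAP/RESTful imperative concrete, here is a minimal sketch of what exposing one operation in both styles could look like. Everything in it is hypothetical for illustration: the `verify_attribute` service, its field names, and the `VerifyAttributeResponse` element are invented, not taken from any consortium's actual design.

```python
import json
import xml.etree.ElementTree as ET

def verify_attribute(subject: str, attribute: str) -> dict:
    """Hypothetical identity-ecosystem check: does `subject` hold
    `attribute`? A stub that always verifies, for illustration only."""
    return {"subject": subject, "attribute": attribute, "verified": True}

def rest_response(result: dict) -> str:
    """RESTful style: the result serialized as a plain JSON document."""
    return json.dumps(result)

def soap_response(result: dict) -> str:
    """Legacy SOAP style: the same result wrapped in a SOAP 1.1 envelope."""
    ns = "http://schemas.xmlsoap.org/soap/envelope/"
    envelope = ET.Element(f"{{{ns}}}Envelope")
    body = ET.SubElement(envelope, f"{{{ns}}}Body")
    resp = ET.SubElement(body, "VerifyAttributeResponse")
    for key, val in result.items():
        ET.SubElement(resp, key).text = str(val)
    return ET.tostring(envelope, encoding="unicode")

if __name__ == "__main__":
    result = verify_attribute("alice", "age_over_18")
    print(rest_response(result))   # one payload as JSON for REST clients
    print(soap_response(result))   # the same payload for SOAP clients
```

The point of the sketch is that the service logic lives in one place; the SOAP and RESTful faces are just serializations of the same result, which is how a consortium could carry the legacy without designing two services.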
Of course, there are many more issues involved in the success of this initiative beyond APIs; these will be covered in more depth in subsequent KuppingerCole reports and in activities at the EIC Conference in May 2013.
Nonetheless, we see NIST’s granting of these awards as a positive move, one that will have a reverberating impact on the identity community, across the globe, for the good and for some time to come.