Simplifying XACML – the Axiomatics ALFA plugin for Eclipse IDE

14.08.2012 by Martin Kuppinger

Axiomatics, a leading vendor in the market of Dynamic Authorization Management systems – sometimes called either Entitlement Management or Policy servers – has recently released a new tool called the ALFA plugin for Eclipse IDE. ALFA stands for “Axiomatics Language for Authorization”.

With this tool, Axiomatics allows developers to author XACML 3.0 policies in the widely used Eclipse environment, using a syntax close to commonly used programming languages like Java or C#.

This is a pretty nice tool which closes a gap in XACML development. Instead of having programmers write XACML policies by hand or use administrative tools to create policies via drag & drop, they can create policies the way – and in an environment – they are used to.

And ALFA is far less complex to use than native XACML. There is a nice video available which demonstrates creating a 107-line XACML file from just 19 lines of ALFA code (pseudocode, to be precise). Other interesting features include support for Eclipse functionality such as syntax checking and auto-completion, which makes creating policies easier.
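To give an impression of what that looks like, here is an illustrative sketch in ALFA-style notation (the policy, rule, and attribute names are hypothetical, and real ALFA code additionally declares the attributes it uses – treat this as pseudocode, not as a verbatim sample from the plugin):

```
namespace example {
    policy documentAccess {
        target clause resourceType == "document"
        apply firstApplicable

        // Managers may access documents
        rule permitManagers {
            target clause userRole == "manager"
            permit
        }

        // Everyone else is denied
        rule denyEveryoneElse {
            deny
        }
    }
}
```

The plugin translates such a compact snippet into the far more verbose XACML 3.0 XML representation behind the scenes.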

As with their Axiomatics Reverse Query (ARQ), Axiomatics again proves that they are thought leaders around XACML. With this tool, they address one of the challenges around the use of XACML and of Dynamic Authorization Management in general.

You might raise the question whether developers really should create XACML policies. The answer is: it depends. In an ideal world, these policies are defined by the business in a way that completely hides the underlying policy language like XACML. But there are use cases where developers might create the policies, especially for point solutions. And many of today’s projects are developer-centric and targeted at specific use cases. So there is clear value in what ALFA provides. But it takes all of the above: ALFA for developers, tools for administrators with some XACML knowledge, and simple tools for the business, integrated with approval workflows and into the overall policy and access management approach.

For developers, approaches like ALFA are needed. That is one important piece in making Dynamic Authorization Management easier to implement and use. The other piece is PEPs (Policy Enforcement Points) which allow relying on XACML policies without knowing anything about XACML. Ideally, a request to a Dynamic Authorization Management system is little more than a line of code calling a method, fully transparent regarding the backend.
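A minimal sketch of that calling convention, in Python (all class and method names here are hypothetical, and the PDP is a simple in-memory stand-in – a real PEP would send a XACML request to a remote Policy Decision Point and parse the response):

```python
class PolicyDecisionPoint:
    """Stand-in for a remote XACML PDP, holding decisions in a dict."""

    def __init__(self, policies):
        # policies: {(subject_role, action, resource): "Permit"/"Deny"}
        self.policies = policies

    def evaluate(self, subject_role, action, resource):
        # Deny-by-default, as a real PDP would for non-matching requests
        return self.policies.get((subject_role, action, resource), "Deny")


class Pep:
    """The enforcement point the application actually talks to."""

    def __init__(self, pdp):
        self._pdp = pdp

    def is_permitted(self, user, action, resource):
        # One method call, boolean result: the caller never sees XACML.
        return self._pdp.evaluate(user["role"], action, resource) == "Permit"


pdp = PolicyDecisionPoint({("manager", "approve", "purchase-order"): "Permit"})
pep = Pep(pdp)
print(pep.is_permitted({"role": "manager"}, "approve", "purchase-order"))  # True
print(pep.is_permitted({"role": "clerk"}, "approve", "purchase-order"))    # False
```

The point is the interface, not the toy logic: the application asks `is_permitted(...)` and stays fully ignorant of the policy language behind it.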

Axiomatics is making good progress in making XACML easy (i.e. transparent) to use – by improving user interfaces, by providing more PEPs out-of-the-box, and by ALFA. That is the right approach, I think.


Posted in Dynamic Authorization Management

The best product for IdM?

13.08.2012 by Martin Kuppinger

A recent discussion in the LinkedIn group “Identity Management Specialists Group” asked for personal opinions on the best IdM product out there. Besides the fact that the accompanying survey listed only five products to choose from, this is, from my perspective, the wrong question. If I take the question at face value, my answer would simply be: “None.” There is no “best product” in that market. There is only the product best suited to solve the customer’s problem. And by the way: what is IdM? OK, it is an abbreviation for “Identity Management”, which is better understood as Identity and Access Management, given that access is a bigger issue than identity. I don’t say that identity is a small challenge, but at the end of the day, business mainly cares about access.

Within the discipline of IAM we have a pretty broad range of different market segments, including Identity Provisioning, Access Governance, Access Management and Federation, Privilege Management, Enterprise Single Sign-On, and several others. IdM or IAM definitely is more than just Identity Provisioning. But to understand which technical building blocks a customer really needs, you need to understand their challenges. What are they really looking for? So it again comes down to this: there is no best product, there is only the product (or set of products) which fits the needs of the customer.

But then another aspect comes in: IAM is not really a technical issue. Asking for the best product ignores the fact that IAM is mainly about organization, about guidelines and policies, and about processes. Without having those defined, you have neither the criteria for choosing a product nor a chance for a successful IAM initiative. You might “successfully” deploy a product, but what matters is successfully implementing IAM processes in the organization. Simply put: technology follows organization.

On the other hand, if you have properly defined your organization, guidelines, policies, and processes, you will observe that most likely no product will meet all of your criteria out-of-the-box, but several products will be able to serve your needs. So the relevance of “the best product” diminishes. There are products which just don’t fit your requirements, but most likely there will be some that do. In those cases the decision might be much more about trust in a vendor and its capability and willingness to support your organization in implementing the product the way you want it than about the technical capabilities of a product.

So even if there were a best product, your implementation of it might fail because the product doesn’t fit your requirements. My most important advice thus is: understand your requirements. Define the organizational “framework” around them. Then pick the product(s) and ensure that the implementation follows your specifications. Then you will most likely succeed. When looking only at technology, you might succeed in deploying technology, but chances are high that you fail in implementing IAM in your organization.


What will it mean when Windows operating systems reject encryption keys smaller than 1024 bits soon?

10.08.2012 by Martin Kuppinger

Microsoft will soon release an update to its current operating systems (Windows XP and higher; Windows Server 2003 and higher) which will block the use of cryptographic keys that are less than 1024 bits in length. This announcement was made quite a while ago, but most links point to a rather specialized place, the “Windows PKI blog”. And honestly, who besides a few geeks really reads such a blog?

The consequence is that certificates with key lengths of 512 bits will be blocked, leading to error messages. These errors can occur when browsing the web, when enrolling certificates, when creating or consuming S/MIME-secured eMail, when installing ActiveX controls, or when installing applications. Most things will keep working smoothly, but some legacy components and applications might fail.
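To illustrate the triage such an update forces, here is a hypothetical Python sketch (the inventory format is invented for illustration; in practice the certificate data would come from a discovery tool). It relies on the fact that an RSA key’s length is simply the bit length of its modulus:

```python
WEAK_THRESHOLD_BITS = 1024  # keys below this will be blocked by the update

def key_bits(modulus: int) -> int:
    # An RSA key's "length" is the bit length of its modulus n
    return modulus.bit_length()

def find_weak_certs(inventory):
    """inventory: list of (name, modulus) pairs from a certificate scan.
    Returns the names of certificates the update will block."""
    return [name for name, n in inventory if key_bits(n) < WEAK_THRESHOLD_BITS]

# Toy moduli standing in for real certificate data
inventory = [
    ("legacy-intranet", 2**511 + 1),   # ~512-bit key: will be blocked
    ("public-web",      2**2047 + 1),  # ~2048-bit key: unaffected
]
print(find_weak_certs(inventory))  # ['legacy-intranet']
```

The real work, of course, is building that inventory in the first place – which is what the discovery tools discussed below are for.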

That might cause some trouble in organizations once the update – which clearly makes sense from a security perspective – is deployed. Unfortunately, handling this issue is not that easy. Microsoft’s approach, described in the blog post mentioned above, is not what I’d call straightforward. It contains a lot of valuable information on how to deal with the issue, but following it requires considerable administrative work.

However, some vendors like Entrust and Venafi offer solutions to discover the certificates used across your network. Both are tools that provide a sort of “Enterprise Certificate Management” as part of the Enterprise Key Management initiative you should have running anyway. If you haven’t started such an initiative, it is long past time to do so – EKM/ECM makes a lot of sense for discovering, managing, and protecting all your certificates and keys across the enterprise. At the lower end of the range of available tools there is the Qualys SSL Labs SSL Server Test, which allows you to run an in-depth analysis of the SSL keys used by publicly available websites. That at least might provide some information for troubleshooting.

The reason behind all this is simple: certificates with a key length of 512 bits have been successfully cracked. This is related to the Flame malware, which is why Microsoft finally decided to block 512-bit keys. Some information about the relationship between Flame and the Microsoft security update can be found in the Microsoft blog post mentioned above.

A question that could be raised is whether 1024-bit keys will then be sufficient, or whether we will face the next update soon. An important fact is that encryption strength grows exponentially with key length, due to the algorithms used. So doubling the key length is not just about doubling computing power. However, there is some likelihood that we will see longer keys being cracked over time. That requires a lot of knowledge and computing power, because no simple algorithm is known yet. There might be one (which would make virtually all of today’s security useless), but most security experts doubt that. So we will have to wait and see. In the meantime, you should try to get a better grip on all the keys and certificates used in your organization – that at least will allow you to react more quickly and with less work to the Microsoft update and to future changes in that area.
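A back-of-the-envelope illustration of that point in Python (this models a pure brute-force search of the key space; real attacks on RSA keys factor the modulus and are faster than brute force, so treat this as an upper-bound illustration, not a security estimate):

```python
def work_factor(bits: int) -> int:
    # Effort to exhaustively search an n-bit key space
    return 2 ** bits

# Each additional bit doubles the effort...
print(work_factor(513) == 2 * work_factor(512))           # True
# ...so 1024-bit keys are not "twice as hard" as 512-bit
# keys against brute force, but 2**512 times as hard.
print(work_factor(1024) // work_factor(512) == 2 ** 512)  # True
```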


Posted in Uncategorized

Doing BYOD right – it’s all about information security

03.08.2012 by Martin Kuppinger

A recent article in Network World online had the title “For BYOD Best Practices, Secure Data, Not Devices”. I fully agree with that title. However, when reading it, I struggled somewhat with the solutions proposed there, which were mainly about “mobile device virtualization” and MAM (Mobile Application Management) instead of classical MDM (Mobile Device Management). Neither mobile device virtualization (we might call it MDV) nor MAM really is about securing data. OK, MAM as proposed by companies like Apperian can at least also protect the communication channel and the storage used by apps. However, the main focus of MAM is on controlling the apps which can be used to access corporate data.

That is neither fundamentally new nor does it solve all the problems in that area. What about access to corporate data like eMail using the standard apps? How do you deal with web access? You still might need to create new apps which are more secure than the standard ones. And if you don’t support the standard apps, you might face acceptance issues.

No doubt, MAM brings value. MDM brings value as well. And other approaches, like that of Enterasys – which has even trademarked the claim “BYOD done right” for its Mobile IAM solution – also bring some value. Enterasys focuses on a network security solution which controls the access of devices and what they are allowed to do, including access to some applications. But here, too, several aspects remain unsolved – starting with users’ access to cloud services, which never even touch the network and thus are never seen by the Enterasys solution.

Several shortcomings might be addressed by configuring apps, cloud services, and so on. However, the more you limit, the higher the risk that users won’t accept the solution – besides all the legal issues of making changes to users’ devices. I particularly like the idea of MDV, which provides an image of a mobile device on another mobile device, so that your corporate apps run in a separate environment which is under better control. However: will these environments be more secure, or will they just duplicate shortcomings like those of iOS and iOS apps? Nevertheless, running corporate apps in virtualized, controlled environments is an interesting approach. But if the user still wants to use the familiar Mail app on iOS, you again reach the limits.

Unfortunately, the (close to) ideal solution – Information/Enterprise Rights Management for mobile devices – is not there yet. And even then you end up with the risk of malicious apps leaking data, since IRM assumes that applications handle information correctly.

What is the conclusion? There is virtually no way not to accept BYOD as a reality, and there is no perfect solution for secure BYOD. You need to understand the risks for corporate information when it is accessed by different classes of devices. You then need to find adequate ways of protection – from open access to prohibiting mobile access entirely. In between, there is room for the different types of solutions mentioned above, as well as some others. You will most likely need a mix of security approaches for your BYOD world, because there isn’t a perfect solution out there – even when several vendors promise that they have found the holy grail of BYOD security. Be assured: no one has so far. So: understand your risks. Identify an appropriate set of technologies which help you mitigate those risks. Define and enforce policies. And do it in a way which allows users to do a lot, so that they can accept that some things are forbidden or only allowed when specific security measures are in place – like MDM, a specialized app, or virtualization.


Posted in BYOD
© 2015 Martin Kuppinger, KuppingerCole