It has become pretty quiet around directory services in recent years. When I recall the discussions some 10, 15, or 20 years ago around NDS versus LAN Manager (and its underlying domain approach), or around Active Directory when it came to market, and even the debates that arose in the early days of OpenLDAP, things are remarkably quiet nowadays. Are all the problems solved? Are the right directories in place? Are the best solutions chosen when something changes?
When talking with end-user organizations, it becomes obvious that we are far from that state. Different directories are implemented, and most of them work well for their specific use case. But when it comes to optimization, the situation changes. What belongs in Active Directory, and what doesn't? How can the way applications deal with directories be optimized? What is the best way to build a corporate directory or a meta directory (the directory as data store, not the meta directory service as a technology for synchronization!)? How should directories be interfaced for specific use cases, and how is information best retrieved?
There are many aspects to discuss and to understand before ending up with an optimized "directory infrastructure". First of all, it is important to understand which directories you have and how they are used – usually there are far more directories out there than you'd expect. And I'm not only talking about Active Directory, eDirectory, and all the LDAP servers, but also about "de facto" directories in the form of database tables and the like – anything that acts as a directory. That includes application directories, of which there might be hundreds of small ones. They sometimes contain sensitive information, such as privacy-relevant data, and they frequently hold somewhat redundant data. Based on this analysis, you can drill down and identify which attributes have to flow between which directories in which use cases.
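To make this analysis concrete, here is a minimal sketch of such an inventory: each directory is mapped to the attributes it holds, and overlapping attributes (candidates for redundancy) are identified. The directory and attribute names are purely illustrative assumptions, not real systems.

```python
# Minimal sketch: an inventory of directories and the attributes each holds.
# Directory names and attributes below are illustrative assumptions.
from collections import defaultdict

inventory = {
    "active_directory": {"userid", "mail", "displayName", "department"},
    "hr_database":      {"userid", "mail", "department", "salaryGrade"},
    "crm_app_table":    {"userid", "mail", "displayName"},
}

def redundant_attributes(inv):
    """Return attributes stored in more than one directory,
    mapped to the set of directories holding them."""
    holders = defaultdict(set)
    for directory, attrs in inv.items():
        for attr in attrs:
            holders[attr].add(directory)
    return {a: d for a, d in holders.items() if len(d) > 1}

print(sorted(redundant_attributes(inventory)))
# → ['department', 'displayName', 'mail', 'userid']
```

Each redundant attribute found this way is a candidate for a defined flow between directories, or for consolidation.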
The latter is mainly about optimizing your provisioning. The analysis, on the other hand, is also a good foundation for optimizing your directory infrastructure: where can you avoid redundancy?
Based on such an overview, you can think about some other aspects:
- Which central directories do you need for which use cases?
- How can application access to directories be optimized?
- Where do you need specific technology for these directories beyond standard LDAP?
There is always a need for some more or less central directories. Active Directory and eDirectory are examples, used for the primary authentication of internal users and for many infrastructure services – but they can't do everything. There are corporate directories for centralized access to corporate information. And there are the more technical meta directories that act as the "source of truth" for distributed information.
We have to think about optimizing the application directories. One or a few centralized directories, combined with Virtual Directory Services such as those offered by Radiant Logic, Oracle, and Symlabs, are an interesting option to build such a centralized yet flexible infrastructure, with the Virtual Directory Service acting as the interface layer.
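The virtual-directory idea can be sketched in a few lines: one query interface sits in front of several backend stores and merges attributes for an entry on the fly. This is a toy illustration under stated assumptions – plain dicts stand in for the LDAP servers and database tables a real virtual directory product would proxy, and the join key and attribute names are invented.

```python
# Minimal sketch of a virtual directory: one lookup interface in front of
# several backend stores, merging attributes by a shared key. Dicts stand in
# for the LDAP servers and database tables a real product would proxy.

class VirtualDirectory:
    def __init__(self, backends):
        # backends: mapping of backend name -> {join_key: {attr: value}}
        self.backends = backends

    def lookup(self, key):
        """Aggregate a single unified entry view across all backends."""
        entry = {}
        for name, store in self.backends.items():
            entry.update(store.get(key, {}))
        return entry or None

vd = VirtualDirectory({
    "ad": {"jdoe": {"mail": "jdoe@example.com", "displayName": "John Doe"}},
    "hr": {"jdoe": {"department": "Sales"}},
})
print(vd.lookup("jdoe"))
```

Applications query the one interface layer and remain unaware of where each attribute actually lives – which is exactly what makes the approach flexible.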
And we have to look at specific use cases where we need specialized technology. There are some innovative vendors out there: UnboundID for highly scalable environments, a field where others like Oracle, Novell, and Siemens are active as well; and eNitiatives, whose ViewDS product offers strong querying capabilities and the ability to easily build "yellow pages"-style interfaces to these directories.
My experience is that there is still a lot of need to think about directory services – and a lot of room for improvement in most IT environments. What is your view on that topic?