Wednesday, November 22

Single Sign-On; Multiple Confusion Points

The Identity Management landscape is littered with various products. It's already difficult to understand IdM product categories and how they work together. But it's made worse because various vendors implement their technologies differently. That's not necessarily a bad thing in itself, but customers are confused.

The problem is analogous to that of Microsoft's Internet Information Services (IIS). IIS is both a web server and an application server. So, in a chart, it would line up against the Apache and Sun web servers. But it would also line up against Tomcat, WebSphere and WebLogic in the application server category. This same phenomenon occurs often with Identity Management products. Some products have a single capability and others have multiple. Some companies offer a single product and others offer several. So, it becomes very difficult to compare apples to apples.

In protest of all this confusion, I continue to attempt to simplify the IdM landscape. A while ago, I tried to create an identity services architecture map that could be used as a visual aid when communicating about where products fit into an overall IdM architecture. While not 100% satisfied with it, I do think it serves the basic purpose.

Lately though, I've been re-thinking Web SSO and Enterprise SSO. What I've found is that people tend to group them together. However, at their core these are two very different technologies.

Web SSO is about securing access. It's generally implemented as a web server filter that intercepts access requests and makes decisions about whether to grant or deny access to the requested resource for the requesting party. Strong Web SSO products typically include a mechanism to manage web resources and user access permissions.
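To make the filter model concrete, here's a rough sketch of that intercept-and-decide flow as Python WSGI middleware. The cookie name, session table and policy table are all invented for illustration; real Web SSO products manage these centrally and do far more.

```python
# Hypothetical sketch of a Web SSO enforcement filter as WSGI middleware.
# Session and policy data are invented; the cookie parsing is deliberately
# naive -- this only illustrates the intercept-and-decide pattern.
SESSIONS = {"abc123": "jdoe"}                     # session token -> user
POLICY = {"/reports": {"jdoe"}, "/admin": set()}  # resource -> allowed users

class SSOFilter:
    def __init__(self, app):
        self.app = app  # the protected application behind the filter

    def __call__(self, environ, start_response):
        # Intercept the request before it reaches the protected resource.
        token = environ.get("HTTP_COOKIE", "").replace("sso_token=", "")
        user = SESSIONS.get(token)
        path = environ.get("PATH_INFO", "/")
        allowed = POLICY.get(path, set())
        if user is None or user not in allowed:
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Access denied"]
        # Authenticated and authorized: pass through to the resource.
        return self.app(environ, start_response)
```

The key point is that the protected application never sees unauthorized requests; the decision happens in front of it.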

ESSO, on the other hand, is about enabling a better end-user experience. Typically, ESSO is implemented as a desktop client that manages user credentials on behalf of a single user. There are some ESSO implementations that act as an access proxy somewhere on the network, but this functionality isn't core to ESSO capabilities. The main driver of ESSO tends to be end-user experience and not security. In fact, I often hear the ESSO question from customers posed something like this:

We want ESSO to make users' lives easier, but we're afraid of the security implications. If a password is compromised, an attacker would now have access to the entire kingdom.
My answer is that this problem isn't with ESSO -- it's a problem with the authentication mechanism. Two-factor authentication would prevent an attacker from gaining access with just a compromised password. And ESSO enables strong password policies on applications throughout the enterprise, which means passwords are much less likely to be compromised in the first place. If users never need to remember or manually enter their passwords, you could reasonably require 10-character passwords with upper, lower, numeric and special characters. And you can force a password change every 30 days (this can also be transparent to the end-user). This creates a password that's much more difficult to guess or brute-force than a typical user's 6- or 8-character alphanumeric password.
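The arithmetic behind that claim is easy to check. A back-of-the-envelope sketch (the symbol counts are rough assumptions, not a formal entropy analysis):

```python
# Back-of-the-envelope brute-force keyspace comparison for the two
# password policies discussed above. Symbol counts are assumptions.

weak = 36 ** 8     # 8 chars, lowercase letters + digits (36 symbols)
strong = 94 ** 10  # 10 chars, full printable ASCII (~94 symbols)

print(f"typical policy keyspace: {weak:.1e}")    # ~2.8e12
print(f"ESSO-enabled keyspace:   {strong:.1e}")  # ~5.4e19
print(f"ratio: {strong // weak:,}")              # ~19 million times larger
```

Even before adding the 30-day rotation, the stronger policy multiplies the search space by a factor in the tens of millions.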

So, I guess I'm thinking that I might move ESSO out of the Access Services layer and into the User Services layer. It's really not doing any enforcement and it's not managing policies or access rights. It's really only fostering a better user experience in a secure way.

And let's just all start ignoring the term Reduced SSO. The term merely acknowledges that not every enterprise system can participate in an SSO implementation. Is this really necessary to distinguish? We're looking to achieve single sign-on for participating applications. Let's not confuse customers further with an additional term that only points out what should already be obvious. If we can make this all simpler, I think we'll see higher adoption (and success) rates and happier customers.

Tuesday, October 24

Convergence of Physical and Logical Security

In my very first blog entry back in February, I wrote briefly about the convergence of physical and logical security. I didn't know much about the topic at the time, but I had a feeling that it was important. The entry is unfortunately a fairly uninteresting and uninformative piece of writing, but it seems to continuously drive a significant amount of traffic to my blog. And it's not just me - I'm starting to see articles pop up everywhere the security discussion is taking place. It's a hot topic.

Throughout 2006, it has become more and more obvious that this convergence is a vital part of securing the enterprise. Since joining RSA on the first of this month, the number of convergence conversations I've been a part of has definitely increased. RSA offers smart cards and card management software that enable organizations to deploy a single authenticator that stores multiple credential sets for use across both physical and logical security. So, that's probably why I'm hearing more about it.

If you're interested in learning more about RSA's offering, take a look at this webinar: The Future of Authentication

I unfortunately don't have any new insights to offer on this topic, but I thought it was worth saying that if you're responsible for securing an organization, you ought to be thinking about this. I also wanted to pull together some information on the topic, since people seem to be coming here looking for it.

A few related articles:
Happy converging...

Friday, October 6

First Week at RSA

I had an interesting and busy first week at RSA. It's no surprise that I met some extremely bright people. I spent my first few days in Phoenix working with an internal team and managed to speak with a few customers as the week progressed. Some of the very cool ideas I've already heard include:

  • Providing Network Access Control using machine certificates. The idea here is that you can't plug in a machine without a proper cert and gain access to the network. RSA has certificate management software that makes this solution a reality. The cert can be based on a specific hardware profile so getting your hands on the cert won't help. It's simple and effective.
  • Risk-based access control, or what RSA calls Adaptive Authentication. This is about adding an additional dimension to the authentication process. Not just what you have and what you know, but where are you right now? Or from which device are you attempting to gain access?
  • The business value of implementing Federation as a way to reduce bandwidth on the WAN. It never even occurred to me until one of my new colleagues pointed it out. Why tie up your global WAN with unnecessary packets (and spend your budget on increasing infrastructure) when you can leverage the web to pass access rights to overseas applications using a simple Federation solution?
  • RSA also has a nice key management utility for organizations that need to build encryption into software solutions but don't want to assume the burden of 1) designing a secure encryption solution, 2) securing the encryption keys for use by the solution, or, worse yet, 3) managing the ongoing key life cycle. Keys can be shared amongst applications and re-generated on a schedule to reduce the risk of compromise.
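The risk-based idea in the second bullet can be sketched in a few lines. Here's a hypothetical scoring scheme; the signals, weights and thresholds are all invented for illustration and are not RSA's actual model:

```python
# Illustrative sketch of risk-based ("adaptive") authentication: score
# contextual signals, then allow, step up, or deny. All factor names,
# weights and thresholds are invented assumptions.
def risk_score(signals):
    score = 0
    if signals.get("new_device"):        score += 40  # unrecognized machine
    if signals.get("unusual_location"):  score += 30  # geographic anomaly
    if signals.get("off_hours"):         score += 10  # atypical login time
    if signals.get("impossible_travel"): score += 50  # two logins, far apart
    return score

def decide(signals, step_up_at=50, deny_at=90):
    score = risk_score(signals)
    if score >= deny_at:
        return "deny"
    if score >= step_up_at:
        return "challenge"  # e.g. require a second authentication factor
    return "allow"
```

A familiar device from a familiar place sails through; an anomalous context triggers a step-up challenge rather than an outright denial.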

Needless to say, I'm already getting very busy. I have a lot to do and I have to say I'm invigorated by the new challenges. ...until next time.

Friday, September 22

TBG's take on EMC + RSA

On The Burton Group's new identity blog, they recently weighed in on EMC's purchase of RSA. I think it's a little more insightful than the other coverage I've seen (albeit at a high level):

Why did EMC acquire RSA Security?

More Info:

Time to rethink your IdM strategy?

"The unexamined life is not worth living"
- Socrates

"The unexamined Identity Management solution is not worth the investment"
- Me

While not quite as eloquent as the original, my point is that we should apply Socrates' theory to business and IT strategy just as we apply it to the way we live our lives. I gave a nod to a blog post a while back that described Identity Management as a lifestyle. Yesterday, in an article entitled CIO Jury: Businesses face ID management headache, the CIO Jury reminds us that identity management is a lifestyle. You can't set it and forget it like Ron Popeil. You need to make a plan, start to build, re-evaluate, launch some functionality, re-evaluate, build some more, alter the plan, evaluate what's coming next, etc. There are certainly technology components that will run on their own without constant maintenance, but they live at a very tactical level. At the strategic level, it's important to continually re-think the plan. The business climate and the technology landscape are both dynamic. Change is constant. Keep this in mind during planning and be prepared to switch gears if needed.

In a recent MaXware webinar, a customer described the process of backing out of a big investment in IdM and starting again with an entirely new product set. The webinar will be available on the MaXware site and it's definitely worth the viewing time. One important take-away for me was that you can forfeit a major investment that's not working, completely shift gears and come out successful. It takes some guts and some ingenuity, but it can pay off. The webinar presented an excellent example of the value of constant re-evaluation.

Another point I want to make is that while planning an Identity services architecture, it's vital to build an environment that is flexible and adaptable. Take a look at Identicentric's idBUS product. They've got the right idea. Build a flexible service-oriented middleware layer that enables you to quickly adapt the front end apps or the back end infrastructure to the ever-changing business requirements. I've also talked in the past about Virtual Directory as a data abstraction layer providing similar capability but at a data access level rather than an application access level. Think about these approaches and how else you might enable your Identity Services infrastructure to adapt quickly to changing requirements.

And rethink your Identity Management strategy. It's always a good time for that.

Tuesday, September 19

The Keys to Successful IdM Deployments

In case you missed Digital ID World this year and didn't get to hear his presentation, Shawn Ellis, Director of Identity and Access Management at Raymond James Financial, will share his enterprise identity management story tomorrow at 11am PST/2pm EST.

Click here to view the Webinar invitation and enroll.

Registration is free and takes just a few minutes. ...There's always something to learn from somebody else's project experience.

Tuesday, August 22

Identity Management Software Design Guidelines

Identity Management infrastructures are large and complex. There are many moving parts and sometimes distinguishing one component from the next is difficult. When you consider that numerous software vendors offer different versions of each component and sometimes classify identity solutions differently from each other, the task of identity software selection can be daunting. To make the task easier, organizations should develop design criteria by which to measure individual software components against each other. Rather than just conducting a feature and functionality comparison, companies should develop a set of architectural considerations that are important within their own organization.

With one eye focused on the move toward service orientation and the underlying premise that business agility will be a key differentiator for companies moving forward, here are a few recommendations for criteria by which to measure identity software solutions:

  • Open: The software is based on open standards rather than a proprietary or closed architecture. It can run on Windows, Unix, Linux, etc. It can be accessed via multiple incoming and outgoing protocols. It's interoperable with other like-minded solutions.

  • Extensible: The software can be extended. Organizations are complex and specific needs vary greatly. Identity software should be able to meet whatever requirements arise now or in the future. The solution should be extensible at multiple points and via open languages or APIs.

  • Flexible: The software can be put to use to solve multiple problems. Identity software that solves only one specific problem is limiting; it needs to be flexible enough to meet multiple demands and solve numerous problems. While an application can't be all things to all people, it can keep flexibility as a core design goal so that companies can leverage the solution to maximize their ability to adapt.

  • Small-Footed: The software leaves a small footprint on the IT architecture. The requirement to load numerous components and additional applications to support the deployment of a single identity solution weighs heavily on the existing architecture. Identity solutions should fit seamlessly into an existing infrastructure without the need for additional software. Each new required component increases the cost and complexity of the environment - and reduces its manageability.

I thought about including performance, but I ultimately decided against it for two reasons: 1) it's highly subjective and 2) it will vary for almost every implementation based on architectural decisions, infrastructure and requirements. So although software makers should strive for high performance, it's difficult to measure without extensive testing.

What else is on your list of identity software design goals?

Thursday, August 10

Burton Document on Identity Data Services

The Burton Group published a new research document today:
Enabling Identity Data Services: New Developments in Identity Tooling Provide a Good Start

You can find it under: Identity and Privacy Strategies

The article touches on many of the same concepts that I've been grappling with lately related to identity data services. They discuss the value of an identity interface layer. The concepts they present are more advanced than what I've been talking about, but I think they're based on the same underlying business drivers, which is encouraging. It's good validation. And I learned a few new things. It's well worth the read if you have a Burton account. Go check it out.

Tuesday, August 8

Dave Kearns on Virtual Directory

Dave Kearns' latest newsletter states: Virtual Directory finally gains recognition. He writes:
"Virtualization is hot and a virtual directory is the building block, or foundation, you should be looking at for your next identity management project"
So, it's not just me. Thanks for having my back, Dave ;)

Monday, August 7

Starting Point for Identity Services

A while ago, I posted about the Identity Management Continuum. I want to revisit that topic from another angle. As we move toward service orientation and establish a more componentized architecture approach, it becomes easier to insert, remove or swap any one particular piece of the puzzle. The continuum still exists - an Identity Services infrastructure is dynamic, should be implemented in phases and should be cyclically re-examined in accordance with business goals. But the starting point clearly varies based on the needs of the organization. For some companies, a particular set of functionality is important. They want to reduce the number of user passwords, provide self-service password recovery, provide a single sign-on service or user-enable a new application. For other companies, the goal is simply to start building an identity infrastructure. Either way, the solution design goals are the same - to implement an open and flexible identity services infrastructure so that components can be added, improved or removed as business needs change.

One solution that makes this architectural approach extremely accessible is an identity services data abstraction layer. An abstraction layer decouples identity services from identity data. Identity data is typically stored in multiple locations, structures and formats throughout an organization. Mapping each identity service to all of those data repositories is a daunting effort. One way of building an abstraction layer is to create an enterprise directory that holds all of the organization's identity information. This can be effective if an organization knows all of the identity and application requirements for this uber-repository. Unfortunately, application requirements change. And data repositories change. And new services rise and fall. Due to the dynamic nature of the business environment, enterprise directories are difficult to manage and maintain. There's often a trade-off between enabling new services and just keeping up with the status quo.

An alternative approach is to leverage virtual directory technology as a data abstraction layer. Virtual directory technology not only effectively maps identity services to identity data to meet current requirements, it also enables the organization to rapidly adapt to changing business needs. Migrating data repositories to new formats or structures is a seamless experience for the consuming applications and services. Adding new services is easy because identity data is accessible in a single location via customizable views.
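As a toy illustration of what such an abstraction layer does, consider joining two back-end stores into one view at query time. The repositories, attribute names and mapping below are all invented; a real virtual directory does this declaratively against live LDAP and database sources:

```python
# Toy sketch of a virtual directory view: identity data stays in its
# source systems; the "view" joins and reshapes it per consumer at query
# time. All repositories, keys and mappings here are invented.
HR_DB = {"1001": {"fname": "Ada", "lname": "Lovelace", "dept": "ENG"}}
AD    = {"alovelace": {"mail": "ada@example.com", "emp_id": "1001"}}

def virtual_entry(username):
    """Join the AD and HR records on employee id and present one view."""
    account = AD[username]
    person = HR_DB[account["emp_id"]]
    # Present the merged record in the structure this consumer asked for,
    # e.g. LDAP-style attribute names:
    return {
        "uid": username,
        "cn": f'{person["fname"]} {person["lname"]}',
        "mail": account["mail"],
        "departmentNumber": person["dept"],
    }
```

Neither source system changed, yet the consumer sees a single coherent entry; swapping out a back-end only changes the mapping, not the consumers.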

I've often heard people discuss virtual directory technology as cutting edge technology that's only adopted by advanced organizations that have well-developed identity infrastructures. I propose that we (as the IdM community) ought to be encouraging the businesses we serve to look at virtual directory technology in the earliest stages of design. Since our process is cyclical and our requirements are dynamic, it's extremely important to be able to adapt to ever-changing identity needs. Virtual directory is a no-brainer for this scenario. If implemented early enough, organizations can greatly reduce the workload of implementing and managing identity services each step of the way - say by 20% - 25%. Isn't that the portion of the effort associated with integrating identity services and applications with the underlying data?

Tuesday, August 1

MaXware Data Synchronization Engine Lite

About two months ago, MaXware announced a free product to the market for a limited time. We've gotten a very positive response and extended the available download window. Here's what you can do with it:

Synchronize Active Directory with Sun Directory Server, IBM Tivoli Directory Server, Oracle Internet Directory, Oracle Database, SQL Server, MySQL, PostgreSQL, ASCII, LDIF, DSML, etc.

Plus:

  • Maintain an audit trail of all changes at an attribute level.
  • Handle errors via SNMP, Windows Event Log, email alerts and custom scripting.
  • Perform schema mapping, attribute transformation and construction.
  • Leverage an extensive built-in script library using your choice of VBScript, JavaScript or Perl.

We're hoping you love it so much, you'll turn to us when your needs grow. That's it. No catch. This is a free production server license for one-to-one data synchronization with no expiration.

More info on MaXware DSE Lite and how to get a copy >>

Sunday, July 30

Identity Services Drill Down

In my last post, I presented an overview illustration of Enterprise Identity Services. Here I'll drill down into each layer to provide some further explanation. For each layer, I'll present an overview, its interactions with other layers, current implementation landscape and emerging technologies.

Identity Services

The Identity Services industry has used many terms to describe the overall umbrella under which all (or most) of its products and services fit. Among them are Directory Services, Identity Management, and Identity & Access Management with Identity Management being the most widely used term. Since Identity Management has come to be used in many cases specifically for user provisioning and delegated admin... And since an identity platform typically provides a service out to the larger organization... And since the larger IT industry is moving toward a software-as-a-service model... it makes sense to me that we start using the phrase Identity Services to describe the umbrella under which our products and services fit.

The Identity Services layers are:
  • Access and Policy Services
  • User Services
  • Identity Data Services
  • Data Storage Services

Access and Policy Services

This is the layer at which policies and access permissions are enforced. As people attempt to access resources throughout an organization, this layer grants or denies access based on user identity information and access policies.

The Access and Policy services layer relies upon the identity data services and data storage layers to provide accurate and up-to-date information about users, rights and policies. The user services layer relies upon this layer to control access to its own services.

Commonly found in this layer are Web SSO and Enterprise SSO applications. But implementations vary widely. In the past, access management was handled individually by each application. It was difficult to manage and control. The ideal future-state scenario is a single centralized enterprise access control platform that every application would leverage. Most organizations today that have already begun to offer identity services are somewhere in the middle.

Emerging technologies in this layer include federation products and standards, network-based access control, user-centric solutions and standardized policy management infrastructures (XACML).
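A minimal sketch of the grant-or-deny decision this layer makes, loosely in the spirit of XACML's rule/effect model (the rules, resources and role names are invented for illustration; real policy engines support far richer conditions and combining algorithms):

```python
# Minimal policy-decision sketch: ordered rules map a resource and the
# requester's roles to permit/deny. Rules and roles are invented.
RULES = [
    {"resource": "/payroll", "roles": {"hr"}, "effect": "permit"},
    {"resource": "/payroll", "roles": {"*"},  "effect": "deny"},
    {"resource": "/wiki",    "roles": {"*"},  "effect": "permit"},
]

def evaluate(user_roles, resource):
    """First applicable rule wins; default-deny if nothing matches."""
    for rule in RULES:
        if rule["resource"] == resource and (
            rule["roles"] & user_roles or "*" in rule["roles"]
        ):
            return rule["effect"]
    return "deny"
```

Centralizing rules like these (rather than hard-coding them into each application) is exactly the shift this layer represents.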

User Services

User services include software and services that enable business efficiency and functionality. Generally, services at this layer are driven by user interaction. Managers manage their employees, grant and deny access, perform attestation duties and run audit reports. System users manage their own information, reset and recover passwords and request access permissions.

The user services layer relies upon the access and policy layer to ensure that people can only perform operations to which they're entitled. It relies on the identity data services layer to achieve the proper view of the data it needs to act upon. And it obviously relies on the data storage layer to store the results of operations performed at this layer.

Typical applications found in this layer include user provisioning and deprovisioning systems, password reset and recovery applications, access management tools and audit reporting. The majority of traditional Identity Management and Identity Services projects and discussions are based in this layer.

Technology emerging in this layer includes advanced attestation reporting, improvements to provisioning, workflow, self-service and password management services and web service enablement.

Identity Data Services

The identity data services layer provides vital services to the overall identity services infrastructure. At the top two layers (access & policy services and user services), systems interact with data that resides on the data storage layer. Each system and application has its own set of requirements and its own data needs. This presents a great challenge for organizations implementing identity services. Identity data is spread out across organizations in network OS and email systems, HR systems, application databases, identity-specific directories and more. This layer provides identity services and applications access to those heterogeneous and disparate data stores in an organized and controlled manner.

The identity data services layer relies upon the access & policy layer to present valid credentials for the user, service or application requesting data access. It presents data out to the access & policy and user services layers but its primary interaction is with the data storage layer where its job is to act as a librarian of sorts. It catalogs identities across systems, joins accounts in multiple systems and presents the data in the format and structure requested by its consumer.

Metadirectories and data synchronization tools have traditionally ruled this layer. Many companies have made progress in identifying identity data throughout their organization and using synchronization products to create reliable sources for identity. Some companies create an enterprise directory that stores a common attribute set that is available to the identity services infrastructure for consumption. Others use synch tools to maintain the existing data stores. Typically, though, there are challenges associated with allowing applications and services to directly access your organization's data stores. Enter Virtual Directory technology. Virtual directories have been steadily proving their worth and will likely emerge as the dominant technology in the identity data services layer. A virtual directory can present a virtualized view of identity data out to the identity services infrastructure for consumption. The view can be customized based on the user or application making the request and can be transformed into multiple formats as appropriate for each consumption point.
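To illustrate that last point about per-consumer formats, here's a toy sketch that presents the same record as LDIF for an LDAP consumer and as JSON for a web-service consumer. The record, DN base and attribute names are invented:

```python
# Toy sketch: one identity record, two consumer-specific presentations.
# The record, DN base and attribute names are invented for illustration.
import json

RECORD = {"uid": "jdoe", "cn": "John Doe", "mail": "jdoe@example.com"}

def as_ldif(rec, base="ou=people,dc=example,dc=com"):
    """LDAP-style view of the record for a directory consumer."""
    lines = [f'dn: uid={rec["uid"]},{base}']
    lines += [f"{attr}: {value}" for attr, value in rec.items()]
    return "\n".join(lines)

def as_json(rec):
    """Web-service view of the same underlying data."""
    return json.dumps(rec)
```

The underlying data never moves or duplicates; only its presentation changes per consumption point.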

Emerging technologies at the identity data services layer include additional built-in connectors for meta- and virtual directories as well as adoption of emerging protocols and standards.

Data Storage Services

The data storage services layer is the simplest to understand. Organizations have had this layer in place for years. This layer holds the databases, directories and other data storage systems that hold identity data throughout the organization. Typically, there are many different types, structures and sizes of identity data stores in a single enterprise.

The data storage services layer may interact with each of the other layers to provide identity data as appropriate. In some cases, the data storage layer is masked by the identity data services layer so that the upper two layers do not directly interact with the data storage layer.

This layer is by far the most mature in terms of technology and adoption. LDAP directories and relational databases have proven their usefulness and are well understood amongst IT professionals. The implementations are as plentiful as the organizations that rely upon them.

Emerging technologies in this area include the adoption of new data storage standards, data virtualization (not to be confused with virtual directories), improved security & encryption and general feature & functionality improvements.

Wednesday, July 19

Enterprise Identity Services

A few months ago, I generated this view of the enterprise identity services landscape. The main goal of identity and access management solutions within an enterprise is to manage users' and applications' access and interactions with corporate assets.

A number of service layers comprise the overall identity infrastructure that enables, restricts and audits this access. Each layer may be composed of multiple technologies. As I've written in the past, Identity solutions will be implemented as services as we move into the future. Each layer should be built on open protocols that facilitate easy communication across layers. Companies should store and manage data in whatever format they're most comfortable with. And employees should be empowered to perform their job functions without technical restrictions.

The layers of enterprise Identity Services as I see them are:

  • Access and Policy Services
  • User Services
  • Identity Data Services
  • Data Storage Services

Note: updated 7/25 with some minor enhancements

Tuesday, July 18

Network Layer Identity Management: Part II

I recently posted an entry about Identity Management in the Network Layer. I wanted to follow-up with some ideas about how MaXware can help companies looking to achieve access management at the network layer.

Deploying access management requires an identity store containing user and access rights information. Solutions like the one from Trusted Network Technologies can use an internal store or leverage an existing store such as Active Directory. The process of designing and creating this store is obviously going to be a big task. Based on roles, policies and other security constructs, data will need to be structured, organized, cleansed, aggregated and integrated. For many organizations, a single Active Directory will not hold the entire universe of network users. There are non-employee business associates, partners, customers, disconnected business units, recently acquired companies, etc.

MaXware Virtual Directory (MVD) can serve as an abstraction layer for this identity and authorization data. MVD can provide the access control software with a single place to look for identity and permission data. MVD can present the data as an LDAP hierarchy or in any other preferred format. Identity and access data can continue to live and be managed in its proper location regardless of format. And if you'd prefer to leverage or build an LDAP directory to hold the identity and authorization data, you may need to synchronize data into that store at regular intervals. MaXware Data Synchronization Engine (DSE) provides a robust and easy-to-implement solution for synchronizing data to and from virtually any type of repository.
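Conceptually, what a synchronization engine computes on each run is the delta between source and target. A generic sketch (record shapes and keys are invented; a real engine does far more, including connector-specific reads and writes):

```python
# Generic sketch of a synchronization pass: compute the adds, modifies
# and deletes needed to bring a target repository in line with a source.
# Record shapes and keys are invented for illustration.
def compute_delta(source, target):
    adds = {k: v for k, v in source.items() if k not in target}
    mods = {k: v for k, v in source.items()
            if k in target and target[k] != v}
    deletes = [k for k in target if k not in source]
    return adds, mods, deletes
```

A real engine would then apply that delta through per-repository connectors, logging each change along the way.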

Identity Management Architecture Interoperability

Neil Macehiter of MWD published a document in May 2006 titled What drives identity management requirements. It's available on the MWD website (with registration) and worth the read. He states:

Enterprises need more than a rich portfolio of identity management functionality; they need an architectural approach which promotes interoperability.
Very true.

One of the ongoing trends in IT is the move toward service-based architecture. The initial rise of software-as-a-service has already taken place. As new applications are rolled out into organizations, IT managers are keeping an eye on making the platform open and interoperable. I've seen this in Identity Management projects. The people implementing IdM within an organization are doing so as a service to the larger organization. Systems and applications throughout the organization need to be able to interoperate with the IdM infrastructure to grant and deny access, verify privileges, create accounts and more. The main point of the paper seems to be that an IdM infrastructure ought to be built with a clear architectural approach that meets these increasingly prevalent requirements.

I see this as another important reason why businesses should be including a Virtual Directory as part of their IdM infrastructure. Adding a virtual directory into the solution immediately improves the solution's ability to be flexible and interoperable. Virtual directories enable organizations to make their identity data:

  • As open and available as the organization wants it to be
  • Accessible with a completely flexible and dynamic organizational structure that can be customized for each application that is accessing the data
  • Accessible via LDAP or virtually any other data structure
  • Accessible as a web service via DSML, SPML or other protocols
  • Available in real-time to applications across the organization regardless of protocol on the back-end data store

Plus:

  • The organization can continue to manage identity data in its preferred system and format
  • There's no need for multiple repositories to support multiple sets of application requirements
  • There is virtually unlimited flexibility in how data could be presented

Virtual directory as an abstraction layer eliminates much of the complexity associated with getting the other IdM components connected appropriately to the data stores. IdM services like provisioning, federation, authentication and access control can leverage a single point of contact to an organization's identity data.

The document provides a blueprint for next generation IdM architecture. And it's a blueprint that makes sense.

Network Layer Identity Management: Part I

Mark Macauley of Trusted Network Technologies has been making the case for Network Layer identity management. It's a very compelling concept. Eric Norlin of DIDW has also posted about NAC - and more than once. Eric sees NAC as front-and-center, one of the major IdM themes for 2006 and possibly the new provisioning. Another TNT blog makes the distinction between Network Access Control and Network Admission Control. It's an important distinction. I was recently at a trade show standing across from a large banner that read Network Access Control. When I inquired, I was given the spiel on what is really Network Admission Control (by Eric's definition). For that company, the distinction really doesn't matter - access to the network vs. admission to the network... who cares? They both sound the same. For TNT, it matters because they provide much more than just binary access to the network. For my purposes, I mean access control. That is, NAC as controlling who has access to what at the network layer rather than at the application layer.

Placing security at the network layer seems to be more secure than at the application layer. The goalie (for lack of a better analogy) wouldn't have to protect the goal if opposing players weren't even allowed on the field. And users who plug a non-compliant laptop into the network have no chance of reaching protected data. You need to be attempting access from a NAC-approved system in order to have any chance of gaining access to the requested resource.

There are still plenty of finer technical points about NAC that I don't fully understand (is packet spoofing possible? etc.), but I do see some business challenges. Assuming a NAC implementation works exactly as I would want and there are no technical concerns, I still think there are some business-related hurdles to overcome before widespread adoption of network layer access control takes place.

1. From what I've seen, the IT industry has moved away from installing agents on desktops and laptops. So, the carrot needs to be very sweet to convince IT managers that agents are the way to go. Complete enterprise single sign-on and elimination of all user names and passwords other than the network logon is a pretty sweet carrot -- we're talking Carrot Halwa. But is that really achievable? I have no experience with this, so I'm actually asking. Have people achieved this?

2. Companies are comfortable with the idea of controlling access at the application layer. Applications have user names and passwords and identity stores and policies that control who can get in. Turning all of that off for every application across the enterprise is a very tall order. I believe it would take a significant amount of time to convince organizations that it's safe to open all the safes in the mansion and take comfort that only those with appropriate permissions to a particular safe will be able to get into the rooms with that safe. It may be purely an emotional obstacle, but one nonetheless. It's just the kind of obstacle that could prevent a very good technology from ever reaching widespread adoption. If I were a NAC vendor, I think I'd spend a huge portion of my time and budget attacking this fear. Find a single high-profile customer, give away the software and make the case. ...just an idea.

3. In the long-term, if organizations do begin to adopt a NAC identity management architecture, NAC vendors will need to either pair-up with the major platform vendors or prepare for a long hard fight. Building identity into network requests and responses could eventually become part of the network OS, desktop OS and server applications. Of course, no single vendor could solve the entire problem without some common standards. I think I smell a new set of protocols out on the horizon waiting to come to fruition as soon as customers get over the fears mentioned above.

...in part II, I'll discuss how I think application-layer identity management solutions (like MaXware's) add value to network layer access management solutions.

* Thanks again to Mark Macauley (who may or may not agree with my thoughts) for providing me some background info on TNT.

Friday, July 14

Business Value of the Identity Metasystem

For as many years as I can remember, I've been handing out this little pearl of wisdom:

The secret to life is perspective.

The main point, of course, is that ultimate happiness (life's true goal) is easily achieved - if not by accomplishing some measurable goal then by re-evaluating one's perception and understanding of the goal and making adjustments as necessary. I'm not saying it's OK to set the bar lower in order to make goals reachable. I'm saying that often what we perceive to be an important and worthy goal does not hold up under closer scrutiny. And a realignment of the importance of things will often help you see the positive aspects of a situation as outweighing the negative. I'm certain that I wasn't the first to achieve this realization, but I arrived at it when I was young and its truth has been consistently reinforced as I navigate through life.

How does it relate to identity management?

Well, it's analogous to the concept of context-based identity. Woven throughout the fabric of the Identity Management blogosphere is the notion of user-centric identity, the concept of context-based identity and the questions of who owns identity and what defines a system user's identity. In the real world, identity is contextual -- your identity differs depending on the perspective of the viewer. Sometimes it's because you intended to shape that perspective, but other times it's the viewer or the situation itself that shapes the light in which your identity is cast.

We each necessarily present some unique subset of our overall identity each time we interact with the world. Even those we most love and trust will see only partial aspects of our identity. We don't expose our Las Vegas selves to our children. And we may not expose our caring, nurturing side at our monthly poker game. We all have friends to whom we expose aspects of our identity that we don't expose to others. Do you share your Star Trek convention identity with the people you work with? Or your girls-night-out identity with your local bank teller? Identity is largely driven by context and perspective.

This is even more true in the world of systems and applications. Our electronic identity, as it's used in practice, is another small subset of our total identity. Often, our identity for any given application consists of authentication credentials and some application-specific identity information. The context boundary seems most often to be at the application but may also be driven by access method (phone vs. Web), geography (at home vs. at work) or some other factor. The real problem that we (the Identity Management community) are trying to solve is: how to effectively and efficiently deliver the appropriate identity context to wherever it's needed in order to enable secure and easy interactions between people and computers (or any combination thereof).

So when the question of User-Centric identity comes up, I wonder about how any one entity could hold all (or even most) of its own identity information. Identity, being subject to perspective, is only half-owned by the one being identified. The other half is owned by the viewer of the identity and/or the situational context. Young children see their parents as superheroes. Most of those parents (I realize not all) wouldn't hold superhero to be part of their identity if they were holding all the cards. The child needs to maintain some aspect of the parent's identity. In the same way, a financial services company defines and re-defines a customer's profile based on the customer's progress, the company's special knowledge and the performance of the economy. Surely, a consumer cannot own those aspects of their own identity. So what are we really talking about when we discuss user-centric identity? Authentication? Authentication plus some basic profile information (age, citizenship, contact info)? Something more?

Bob Blakley recently posted a blog entry titled The Meta-Identity System in which he makes the case for Identity Oracles rather than Identity Providers. His points are important. In fact, it hadn't even occurred to me that anyone would choose to implement an identity provider that actually sends identity data rather than identity metadata. Bob's point is that identity providers in the Identity Metasystem ought to provide answers to basic questions rather than shipping real identity information, e.g. Am I over 21? rather than What is my birth date? But this information still seems very generic. The receiving application will still own roughly half of the user's identity (with regard to the user's interaction with that particular system). The user really only owns the portion of her identity that is non-specific to the application (or not context-constrained).

So is this entire user-centricity effort really about minimizing the effort of enrolling in various sites by repeatedly entering name, email, address and credit card details? To make this huge undertaking worthwhile, I believe there needs to be more value to it. If it's only to simplify enrollment and authentication, then it's really just Microsoft Passport 2.0 but with your own choice of identity provider. Is simplified enrollment and authentication really the goal of user-centric identity solutions?

I realize that the concept that identity is contextual is not new -- it's even part of the original identity metasystem concept. And the metasystem also acknowledges that existing identity systems are not replaced by the metasystem -- they still own part of the identity. So, it's not asking us to give the entire identity to the user. I just want to be clear about what we stand to gain. Is it:

  • Fewer sets of credentials for users to manage?
  • Consistent user experience? (Are enrollment and authentication activities really confusing people with complex interfaces?)
  • Greater degree of user control over their identity? (really?)
  • Facilitate use of existing identities across boundaries? (like federation?)
  • New market opportunities? (examples?)

It sounds a lot like an authority-neutral Microsoft Passport. I'm unfortunately somewhat behind the eight ball on user-centric identity. From my perspective, it's been a debate about how to build the next generation Internet identity infrastructure and I've been pretty focused on the current generation. But I'm starting to give it some thought. I'm playing catch-up and I'm wondering: What am I not getting? Am I just underestimating the frustration associated with multiple sets of credentials? Maybe it's because I use a Digital Persona system and don't worry much about multiple usernames and passwords. Enlighten me. What will drive the world to invest in an identity metasystem?

Don't get me wrong. I do get it to some degree. I realize the Internet experience will be better for everyone if we tackle the issue of the many identifiers that connect us one-by-one into systems and applications. I just don't know if I've seen a business driver that's big enough to drive an effort as big as a global identity metasystem. So clue me in. Is it just about making the world a better place? If so, this is a bigger story than I suspected.

Wednesday, July 5

Beyond Simple Federation

One of the typical scenarios we've encountered when discussing our federation solutions with customers includes dynamic account provisioning. A number of people at Network World in Chicago last week were asking me about this capability, so here's an overview:

Typical federation scenarios include a person attempting to access an application that lives outside of the security domain in which the person is currently authenticated. This is achieved by authenticating the user within the current security domain and passing an assertion to the requested application stating that the user is authenticated (and providing the user's identity information). This assumes that an account has been created in the foreign application for the user. The process of creating and managing accounts for this purpose is typically a manual process due to the distributed environment. Whether manual or not, the process that's in place usually kicks off whenever a new employee account is created within the originating security domain and ultimately concludes with an account being created in the external environment in a ready state - waiting for the user to come along at some later date and log on. There are a number of possible reasons why you might not want accounts pre-created for system users who may only potentially access an external application.

MaXware's Federation Server (MFS) is built on top of our Virtual Directory product (MVD). MVD enables account authentication to one or more heterogeneous back-end data stores. It also enables account creation to one or more heterogeneous back-end data stores. So, one of the scenarios we've implemented for a customer in the past (and will likely again sometime soon) includes on-demand federated account provisioning. In this scenario, the user is authenticated in their home environment and an assertion is sent to the external application environment. MFS on the application side leverages its internal MVD to dynamically create an account in the requested application. And to take it a step further, MFS actually opens a session on behalf of the user so that when the user is redirected to the application, there is no need to logon again. So, it also serves as a distributed SSO environment. But, that was the original point of federation, wasn't it? Pretty cool stuff. You don't have to worry about orphaned accounts if they don't get created in the first place. On-demand federated account provisioning -- the next big thing? Maybe not, but it's good to know that it's there if you need it.
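For readers who like pseudocode more than prose, here's a minimal sketch of the on-demand flow on the application side. This is my own hypothetical illustration of the pattern (just-in-time account creation plus session hand-off), not MFS's actual internals:

```python
# Sketch of on-demand federated account provisioning (a hypothetical flow,
# not MaXware's actual implementation). The service-provider side receives
# an already-verified assertion, creates the local account only if it
# doesn't exist yet, and hands back a live session so the user never sees
# a second logon prompt.

app_accounts = {}   # the application's local account store
sessions = {}       # session-id -> username

def handle_assertion(assertion):
    """Process a (pre-verified) authentication assertion from the home domain."""
    user = assertion["subject"]
    if user not in app_accounts:
        # Just-in-time provisioning: no pre-created accounts waiting around,
        # and therefore no orphaned accounts to clean up later.
        app_accounts[user] = {"mail": assertion["mail"],
                              "created_by": "federation"}
    session_id = f"sess-{len(sessions) + 1}"
    sessions[session_id] = user   # open a session on the user's behalf
    return session_id

sid = handle_assertion({"subject": "akim", "mail": "akim@example.com"})
print(sessions[sid])  # the user lands in the application already logged in
```

The two properties the post describes both fall out of this shape: the account exists only once someone actually shows up, and the session is opened before the user is redirected.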

Friday, June 16

Chief Identity Officer?

My colleague Matt P. recently introduced the concept of a Chief Identity Officer (CIdO) and a related Identity Management department. It's an interesting question -- who owns enterprise user identities? He's not asking who actually owns them now, but rather who should. If a CIdO presided over a department of IdM, which included HR, what would that mean for identity security and enterprise identity management? Well, I agree with Matt that one owner would certainly make IdM projects easier to manage, but that's not the greatest benefit.

I would think the CIdO would need responsibility for all user identities -- employees, partners, customers, etc. She would find ways to enable the business while mandating the IT organization to implement solutions that follow strict security guidelines. All applications requiring user interaction would need to work through the CIdO office to get user enabled. In the real world, this seems like a long shot, but introducing the concept may provide a wake-up call to organizations with no executive sponsorship of user identities (and they do exist). I guess my vision would include a Director of Identity that reports to the CIO or equivalent. She would be responsible for compliance, attestation requirements, establishing Identity policies, ownership of IdM solutions, backup and recovery solutions for identity-enabled applications, etc.

Having a single office responsible for identity in an organization would yield numerous benefits. First, the people responsible for email systems, network OS, perimeter security, HR employee solutions, application development and others would be able to concentrate on their own responsibilities. Second, somebody would be 100% focused on how to provide the best identity solutions to the business while maintaining the highest standards of security. It's natural, for example, for application owners to make decisions that will enable their app users while diminishing audit capabilities. A director of IdM wouldn't think in those terms - she would need to find solutions that enable the business, facilitate ease-of-use and also maintain strict security guidelines. IdM solutions span the enterprise, and the design, architecture and management thereof ought to be central. We've all heard the cliché - a chain is only as strong as its weakest link. Well, if identity solutions are managed by strict policies from a single office, perhaps we would be less likely to lose a laptop holding the identity information of 250,000 people. In fact, we'd be less likely to have a laptop with important identity information at all. Fewer total links means fewer weak links in the chain. And to beat the analogy to death, a Director of IdM would mean somebody is there with a welding torch maintaining the chain and designing improvements rather than each group owning their own link. Something to think about. ...especially if you're one of those organizations with no executive IdM oversight.

Monday, June 12

Whodentity? Tor Who.

Mark Dixon posted a blog entry announcing a page called Whodentity? on which he lists a number of people who are influential Identity Management characters. In his blog entry, he mentions Eric Norlin's Top 10 most important people in Identity and asks: was he correct? I think the answer will (and should) vary depending on your perspective. For example:

  • For Identity Management solution implementers, I can't think of anyone who has provided more valuable information than Mark Dixon himself. His blog series on IdM Implementation Risks is well worth the price of admission alone and should be required reading for anybody planning an IdM project. I think we're still waiting for full entries on the final two risks (no pressure Mark).

  • If you're a product manager thinking about what features to next build into your IdM products, you might look to Microsoft's Kim Cameron for food-for-thought about the future of Identity technologies.

  • If you're an enterprise IT manager and need to understand what products are out there and how they might help your organization, you might turn to Dave Kearns or Digital ID World's Phil Becker & Eric Norlin.

So I think every person's top ten may be different, depending on what information they need. Mark's post inspired me to re-read the Top 10 and I saw in the number one spot Jamie Lewis of Burton Group. No doubt that Jamie has contributed a great deal to the industry. The text states that he wrote the original white paper outlining the concept of a metadirectory and that the white paper "essentially gave birth to the identity industry". This may be true, but it got me thinking. My research tells me that the Burton paper came out in 1996 shortly before the ZoomIt metadirectory product, which was also released in 1996 and later became Microsoft Metadirectory Server. And so we can conclude that people building Identity Management expertise in 1996 are considered innovators and should be praised for their foresight.

And so with that background, I'd like to contribute another name to the discussion - MaXware's own Tor Even Dahl. Tor Even wrote what I believe to be the world's first metadirectory product in 1995. That product remains to this date at the core of our data synchronization and provisioning products. And although its architecture was different from what ZoomIt was building around the same time, Microsoft re-built MIIS to have a similar architecture when they started from scratch in 2002 - which only proves that the architecture has withstood the test of time. Tor Even went on to write the world's first virtual directory product in 1998 (then called an LDAP proxy). So, here's a guy who wrote both the first metadirectory and the first virtual directory but has no real industry recognition. He probably likes it that way. Like many Norwegians, Tor Even is humble and he'll probably cringe to see his name mentioned like this, so let me apologize in advance (Sorry Tor Even). But, I thought his accomplishments worth mentioning to the larger community.

With the help of a few other people, Tor Even Dahl built one of the first pure-play Identity Management companies and today remains the CTO of a successful IdM company that has over 300 customers in 30 countries around the globe. So, here's a toast to Tor Even Dahl, as they say in Norway, "Skål!"

...and thanks Mark for Whodentity? It's a list that is sure to provide many interesting viewpoints.

Monday, June 5

MaXware News

The whitepaper I mentioned previously, Identity Management in a Service Oriented Architecture (SOA), is now available on the MaXware website.

Also, MaXware is offering a lite version of its Data Synchronization Engine free of charge through July 31, 2006. DSE Lite is a great way for organizations to build real value while getting familiar with MaXware products. It provides a flexible data synchronization solution for bi-directional sync between any two data repositories. There is also a comparison between DSE Lite and the full version of DSE.

Lastly, MaXware just released a new version of the MaXware Virtual Directory with many upgrades, including built-in connectors for SAP and Salesforce.com. These connectors enable querying and account provisioning to/from these systems over standard protocols (LDAP, SPML, DSML, etc.). Kudos to the MVD development team for a very nice release!

Friday, June 2

Password Solutions

One of the primary business drivers for Identity Management has always been password management. System users don't want to remember multiple passwords -- especially if the systems require password changes on different cycles and have varying policies for password length and structure. The frustration is more than understandable. And we've all seen the reports that tally the cost associated with that frustration: password-related help desk calls cost $x per call, and x calls per thousand users each year adds up to an enormous cost of managing passwords. And that's without even mentioning the security implications of having too many passwords.

What we haven't heard often is that there are a number of possible solutions to this problem. We may have heard each of these solutions individually, but usually from a single perspective. I believe that technology infrastructures are complex and dynamic. They're not one-size-fits-all. Every enterprise has a unique environment and somebody needs to (or at least ought to) put forth some actual thought about how to best approach any given environment. So, with that, here are four approaches to the problem of passwords. They are not mutually exclusive and depending on the size of an organization, it might make sense to look at more than one of these approaches.

1. Password Management (Reduced Sign On)
These solutions synchronize passwords across multiple systems. They enforce strong passwords that adhere to the policies of each of the connected systems and remember password histories so that people can't repeatedly use the same password. One of the nicest features of password management solutions is that they allow end-users to recover lost or forgotten passwords through some type of self-service mechanism. This cuts the helpdesk costs and leads to real ROI. System users are left with multiple user accounts which have a common password.

2. Centralized Authentication Infrastructure (Single Sign On)
These are the typical SSO solutions that leverage some type of access policy server. The SSO system identifies a user and determines whether or not to grant access to the requested resource. SSO vendors typically leverage LDAP to authenticate users against an enterprise LDAP directory. If an organization prefers to manage data in a relational database, the SSO system can send LDAP requests to a Virtual Directory which would then pass the request to a back-end database (or any other set of data stores). SSO solutions centralize authentication to a single account and enable management of permissions to access enterprise resources. On the back-end, these systems require access to an authoritative identity data store and therefore usually require either some form of identity data synchronization or a virtual directory implementation (as previously mentioned). One example of multiple back-end data sources is leveraging Active Directory for employees and an Oracle database for non-employees.

3. Single Authentication Source
In this scenario, an organization would forgo the expense and infrastructure of implementing an SSO environment, but would implement a single authentication source with varying authentication mechanisms and logic for each application. If there are a limited number of applications, the value of a true SSO implementation may not be fully realizable. Each application in this scenario would verify credentials against a Directory or Virtual Directory server. Just like in the SSO scenario, a Virtual Directory would enable lookups to one or more data sources on the back-end. This allows the organization to leverage a single set of credentials for authentication to multiple applications.

4. Enterprise Single Sign On (ESSO)
ESSO solutions reside on an enterprise network and manage user access to various resources. The user experience is pleasant because there is only a single point of authentication. Each application maintains its own set of credentials, but the ESSO solution maps user accounts for each system and performs the authentication process on behalf of the end user. Some things to consider are application password change policies and how access from external locations would be handled. The user still has multiple system accounts each with its own password, but the logon process is made transparent by the ESSO solution.

Federation solutions offer another way to handle authentication across multiple systems and may be able to provide an additional solution to password challenges. However, federation solutions are typically implemented for authentication across security domains whereas passwords are typically managed within a given security domain. User-centric identity solutions are another way to approach access across multiple security domains. Ultimately, organizations should consider the alternatives and implement solutions that will provide value within their own given environment based on their own specific requirements.

Tuesday, May 23

Identity Management for SOA

From my soon-to-be-released MaXware whitepaper titled Identity Management in a Service Oriented Architecture:

As companies continue to deploy identity management solutions throughout 2006 and 2007, they should do so with an eye toward their enterprise SOA strategy. Organizations that implement identity management solutions will benefit most over the long term when IdM solutions are offered as a service. Companies are beginning to move toward Service Oriented Architecture models and adoption is already gaining momentum. IdM will be a critical component of SOA infrastructure securing service access to applications as well as access to the services themselves. Applications will be looking to access identity data throughout the organization over standard service protocols like DSML and SPML. MaXware is uniquely positioned to meet the current and future needs of organizations looking to build, manage and secure an SOA infrastructure.

There is certainly a significant amount of hype about SOA, but that doesn't necessarily mean that it's only hype. SOA done right has the potential to greatly reduce application integration costs, which has been an albatross for many large organizations. I expect a gradual adoption over the next few years as software vendors provide SOA hooks into their software. I don't really see companies rolling out enterprise-wide SOA infrastructure, but as more apps become service oriented, communication between platforms will become much easier to facilitate. Organizations will obviously need to consider security for the SOA infrastructure but they should also expect that Identity Management solutions (provisioning systems, identity data stores, etc) should be available as a service over standard protocols. If you're interested in the whitepaper, check our web site -- it should be available within the next few days.

Monday, May 8

SAML for Secure SOA

I've been researching Service Oriented Architecture (SOA) and came across this article on the JavaWorld site - Secure your SOA.

from the article:
"In the past, most systems were designed under the assumptions that a single system would possess all of the information necessary to make access control decisions and all the data would be recorded in the audit trail. However, large-scale distributed systems are always built by multiple organizations with a mixture of products. Thus, users may be authenticated by different authorities using different methods. In addition, different authorities retain different information about user properties and attributes. Centralizing all capabilities and information is just not practical. SAML provides standard formats to express authentication and user attributes, and the protocols to request and receive."

With regard to service-enablement for our customers, I've been primarily focused on the ability of our Virtual Directory to enable service-based access (DSML, SPML, other) to identity data in practically any back-end format. I see this as very cool technology. Simple, but effective.

This article lays out another concept I've been thinking about, which is IdM as providing security for the SOA environment itself. That is, playing its role in locking down access to and from the services themselves. I've seen a number of articles and analyst reports that expect SOA deployments to take off throughout 2006 and 2007. Even if companies aren't ready to transform their entire infrastructure to an SOA, we can expect some degree of adoption within most large organizations.

An efficient and flexible Identity Management infrastructure is going to be critical in securing access to the SOA services themselves and their access to other systems and applications. Our Federation Server, using SAML, is an ideal candidate for enabling a secure SOA environment.

The scenario would look like this:
  1. Application-A (portal) requests data from Application-B (HR System).

  2. Federation Server authenticates Application-A and generates a SAML assertion.

  3. Application-A forwards the assertion to Application-B, which verifies the assertion and grants or denies access to its resources based on the information in the assertion.

The scenario would work the same if it were a user attempting to access an application instead of an application-to-application interface. Or potentially, in a portal scenario, the portal requesting access on behalf of the user with multiple layers of trust. Interesting stuff.
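The three steps above can be sketched in miniature. Real SAML assertions are signed XML verified against the issuer's certificate; here an HMAC over a JSON body stands in for the signature, so this shows only the shape of the trust relationship, not the actual protocol:

```python
# Toy version of the three-step flow, with an HMAC standing in for the
# SAML signature (real SAML uses XML Signature and certificates; this
# only illustrates the trust shape). Key and names are hypothetical.
import hmac, hashlib, json

SHARED_KEY = b"idp-and-sp-trust-key"  # hypothetical pre-established trust

def issue_assertion(subject, audience):
    """Step 2: the federation server authenticates the caller and signs a claim."""
    body = json.dumps({"subject": subject, "audience": audience}).encode()
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def accept_assertion(assertion, expected_audience):
    """Step 3: Application-B verifies the signature and the intended audience."""
    sig = hmac.new(SHARED_KEY, assertion["body"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, assertion["sig"]):
        return False  # tampered or forged assertion
    claims = json.loads(assertion["body"])
    return claims["audience"] == expected_audience

token = issue_assertion("application-A", "hr-system")   # steps 1-2
print(accept_assertion(token, "hr-system"))             # step 3 -> True
```

Swap the application subject for a user subject and the flow is identical, which is why the same infrastructure covers both the user-to-application and application-to-application cases.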

The next step might be to add-on a provisioning environment that captures audit data on service access rights for SOA governance. I'd be curious to hear how people are implementing SOA governance. Would traditional user-provisioning systems fit the bill?

Wednesday, April 26

Security and Password Myths

Kaliya Hamlin pointed to an article about password security and what its author (Prof. Eugene Spafford) calls security myths. It's an interesting article, but I don't agree with the main point, which is that mandatory password changes do not increase security. He calls these policies folk wisdom and claims that best practices are "intended as a default policy for those who don’t have the necessary data or training to do a reasonable risk assessment". Well, I don't agree with that statement or Prof. Spafford's conclusion.

Best practices as I use the term describe an ideal state without knowledge of a given environment. Every environment has exceptions and special needs. Therefore, it's not always possible to implement best practices. But, they should serve as an ideal to work toward. Default policies, on the other hand, are often what's easiest to implement -- just ask any company that sells wireless home networking products. These products are usually shipped with default settings that make setup easy. Best practices, however, require that the installer configure encryption keys that prevent people in close proximity from accessing the network.

Let's move on to password change policies. While there are certainly (as Prof. Spafford writes) a number of password failure modes, these policies exist to minimize the effectiveness of one of them -- cracking. We may simply differ on how we define the cracking threat. This article by Geodsoft discusses password cracking techniques. The takeaway is that with a strong password policy, a brute-force cracking attempt will take over two months at 6 characters and two years at 8 characters. It's certainly possible to improve that timeframe with heavy hardware infrastructure, but I think the policy will serve its purpose of reducing the threat. And that's ultimately the goal. We all know that nothing in IT is 100% secure, but we should implement as many practical policies and solutions as we can to reduce the potential threat.
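To illustrate why password length dominates the cracking timeframe, here's a quick back-of-envelope calculation. The guess rate is an arbitrary assumption for illustration, not a claim about any particular hardware:

```python
# Back-of-envelope brute-force math. The guess rate is an assumption for
# illustration only; real attack hardware varies enormously.
charset = 94          # printable ASCII characters under a strong policy
rate = 1e9            # assumed guesses per second (hypothetical)

for length in (6, 8):
    keyspace = charset ** length
    seconds = keyspace / rate
    print(f"{length} chars: {keyspace:.2e} combinations, "
          f"~{seconds / 86400:.3g} days to exhaust")

# Each extra character multiplies the work by 94, so going from 6 to 8
# characters multiplies the search space by 94**2 = 8,836.
```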

Thursday, April 20

IT Pragmatism

There's been a lot of blog discussion lately about IT practicality, software companies and enterprise architecture. It's refreshing to hear people taking a step away from large, complex and "enterprisey" architectures. I'm a pragmatist. I think IT solutions should solve a business problem and NOT introduce new ones. When I was in the services business, I tried to design and implement solutions that were simple, easy to use and efficient. I joined MaXware because they have very practical software products that work without complex architectural requirements.

People who want to spend years implementing an IT solution seem confused to me -- the business requirements will change drastically over those years. You need a direction and a goal to work toward, but implement quick-win, real-world solutions that provide value and you'll move toward your goal while minimizing cost and frustration. I think this has been especially true of Identity Management solutions and will continue to be so for years to come.

So, if you're embarking on an Identity Management journey, try to work out a three-month cyclical pattern of building real business value while driving toward a long-term goal. You'll generate executive and team support for the project by showing quick success, and you'll be ready to adapt as the business landscape changes.

Wednesday, April 12

RSA Secured and Next Generation IP Networks

MaXware is now RSA Secured for SecurID Authentication and ClearTrust Access Management. Not really anything new from a technology standpoint, but worth mentioning.

Also, thanks to Kim Cameron for his nod toward Marcus Lasance's article in European Communications. Marcus discusses Identity Management for IMS (IP Multimedia Subsystem). Marcus is well-seasoned in telecommunications and really knows his way around that arena. I'm interested to see how triple- and quadruple-play service offerings come to market and where IdM will play its part.

Thursday, March 30

Common Virtual Directory Scenarios

The discussion regarding possible uses for Virtual Directory is ongoing. The following are eight easy-to-understand scenarios for Virtual Directory, in no particular order. This is by no means an exhaustive list, but I think it covers the simplest scenarios. I look forward to questions or comments.

  1. Protocol Translation - Provide access to relational and other non-standardized data over standard LDAP and Web Services protocols without altering the data.

  2. Web Service Enablement - Respond to identity data requests made via DSML, SPML or any other service-oriented data format (standards-based or custom).

  3. Multi-Repository Search - Enable a single search over standard protocols to return a single clean result-set containing identity data that resides in multiple repositories in multiple formats.

  4. Joined Identity View - Enable a search that returns a view of single identities composed of data from multiple repositories. e.g.) A single user record is presented with name and phone number from the HR system and the email address from Active Directory.

  5. Permission-Based Results - Enable a customized view into a single data universe based on which application or which user is performing the search. e.g.) Employees inside the corporate firewall see a full view of fellow employees while customers accessing an external-facing application see a reduced set of attributes and phone number is formatted using the (toll-free + extension) format.

  6. Dynamic DIT - Build an on-the-fly Directory Information Tree based on identity data attributes. e.g.) The application calls for LDAP views based on job title so the virtual directory dynamically presents an OU for each job title in the database and presents employees within the appropriate OU based on their job title.

  7. Authentication - Enable pass-through authentication from a single point of entry into multiple identity data stores. e.g.) Authentication requests are directed to a single point. The Virtual Directory authenticates non-employees against a back-end Sun Directory and employees against Active Directory.

  8. Real-Time Data Access - Provide real-time access into back-end systems. Because requests are passed to the originating data source, the search results can be as real-time as required.
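As a concrete illustration of scenario 4, here's a toy sketch of a joined identity view; the data stores and attribute names are hypothetical stand-ins for real back-ends:

```python
# Minimal sketch of a "joined identity view" (scenario 4): one virtual record
# assembled on the fly from two back-end stores. All names are hypothetical.
hr_system = {
    "jdoe": {"name": "Jane Doe", "phone": "+1-555-0100"},
}
active_directory = {
    "jdoe": {"mail": "jane.doe@example.com", "sAMAccountName": "jdoe"},
}

def joined_view(uid: str) -> dict:
    """Name and phone come from HR; email comes from Active Directory."""
    record = {"uid": uid}
    record.update({k: hr_system[uid][k] for k in ("name", "phone")})
    record["mail"] = active_directory[uid]["mail"]
    return record

print(joined_view("jdoe"))
# {'uid': 'jdoe', 'name': 'Jane Doe', 'phone': '+1-555-0100',
#  'mail': 'jane.doe@example.com'}
```

The point is that neither back-end is copied or migrated; the joined record exists only in the response.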

Summary

Virtual Directory technologies eliminate boundaries. Hassles related to LDAP object types, attribute definitions and other schema-related issues are eliminated by virtualizing the view into the back-end identity stores. You're no longer limited by the existing data format or database branding. There's no requirement to migrate data from a relational database into an LDAP directory in order to make it LDAP- or Web Service-accessible.

Thursday, March 23

Showdown: MIIS vs. DSE

Prior to joining MaXware, I spent a significant amount of time working with Microsoft Identity Integration Server (MIIS). Since joining, I've had a lot of questions from old friends about how the new software stacks up (and vice-versa from new friends). So, I put together a summary of what I've found thus far. I'll discuss MIIS and MaXware Data Synchronization Engine (DSE). Thanks to its very descriptive product name, you can probably guess what DSE does. MaXware also offers the MaXware Identity Center (MIC) for advanced user lifecycle management (workflow, etc.). But for now, I'll stick to data synchronization.

INFRASTRUCTURE

I think the most obvious difference between DSE and MIIS is infrastructure. MIIS requires Windows Server 2003 and SQL Server. For practical purposes, it also requires Visual Studio. DSE, on the other hand, can run on any platform supporting Java Runtime 1.4 or later. DSE also doesn't require a central database, but can leverage Oracle, SQL Server, MySQL, Access or any other ODBC-, JDBC- or OLEDB-compliant database. With DSE, you can build very simple jobs that read and write directly from one data source to another without setting up a central database. If you want to perform data joins, you can set up a collect database as part of the solution (similar to an MIIS metaverse). So, the DSE footprint is very small. And DSE runs very efficiently using Access as a central store, even with a few hundred thousand entries. If performance is critical, support for the bigger databases is available.

DELTAS

Another advantage of DSE is that it doesn't rely on deltas being managed at the connected data store. So, for example, you don't need to enable the Sun directory retro changelog in order to process only changed entries. DSE handles this by storing its own delta table in the central database with a hash of each record and comparing the hash before performing an action on the record. If you'd prefer to rely on the changelog, that's OK too. You may be able to improve performance that way. However, you might lose some level of audit capability since DSE won't know which attributes have changed.
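The hash-compare approach described above can be sketched in a few lines. This is an illustration of the general technique, not DSE's actual implementation; the table and column names are made up:

```python
# Sketch of hash-based delta detection: store a hash per record in a local
# table and compare it on the next run to classify each record as an add,
# a modify, or unchanged. Illustrative only; not DSE's real schema.
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE delta (record_id TEXT PRIMARY KEY, hash TEXT)")

def record_hash(record: dict) -> str:
    # Canonicalize attribute order so the hash is stable across runs.
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def detect_change(record_id: str, record: dict) -> str:
    """Return 'add', 'modify', or 'unchanged' and refresh the stored hash."""
    new_hash = record_hash(record)
    row = conn.execute("SELECT hash FROM delta WHERE record_id = ?",
                       (record_id,)).fetchone()
    if row is None:
        action = "add"
    elif row[0] != new_hash:
        action = "modify"
    else:
        action = "unchanged"
    conn.execute("INSERT OR REPLACE INTO delta VALUES (?, ?)",
                 (record_id, new_hash))
    return action

print(detect_change("jdoe", {"mail": "jdoe@example.com"}))   # add
print(detect_change("jdoe", {"mail": "jdoe@example.com"}))   # unchanged
print(detect_change("jdoe", {"mail": "j.doe@example.com"}))  # modify
```

Because the old hash (or, with full attribute storage, the old values) lives locally, the connected source never needs its own changelog.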

CONFIGURATION

Another difference is in the storage of server configuration. MIIS stores its server configuration in the SQL Server database. DSE stores its configuration in a single XML file. MIIS, though, does a pretty good job of providing XML-based server configuration export and import. It seems a little more complicated than DSE, but not by much.

SERVER FLEXIBILITY

MIIS uses a single metaverse for each MIIS instance. So, a set of Management Agents writes to and from the metaverse to perform actions on the user objects. This is fine until you want to manage multiple sets of users. If, for example, you want to manage test data and production data, it becomes difficult to manage due to join rules. It's easy to join incorrect objects by running an MA against a test directory instance with different data. With DSE, jobs are organized in groups. Each group is a set of Passes (a pass is similar to a management agent and a corresponding run profile). And you would setup each group to have its own collect database (or metaverse). This allows you to do more with a single instance of DSE and provides extensive flexibility.

DIRECTION

DSE passes are one-way; MIIS management agents are bi-directional. So, if you want to read from and write to a given directory, you would create a FROM pass, probably run a few other FROM passes and perform some actions on the data, and then run a TO pass back to the directory. In the MIIS world, it's common to run a single MA multiple times to capture data changes that happened after the initial import run. While I prefer the way DSE handles this, I don't see it as a big advantage for either solution.

SCRIPTING

On the Windows platform, DSE supports VBScript, JScript and PerlScript. On the Java platform, DSE supports JavaScript. Obviously, VBScript can call a DLL written in VB or C#, and JavaScript can call compiled Java code. MIIS supports C# and VB.NET. Both provide a set of native functions to interact with objects as you move them around. Openness is generally a good thing -- especially if you like Java -- but VB.NET and C# can probably handle any functionality you're looking to accomplish. I think DSE has the advantage here, but if you'll be using C#, MIIS is more directly integrated, which can be considered a positive.

AUDIT DATA

Both solutions provide logging, but DSE has the advantage of local delta tables for audit and reporting. The separation of TO and FROM passes also makes it easy to include a pass that writes to an audit table before losing the old and/or new attribute values. While XML data mining is possible with MIIS, I would much rather work with DSE if advanced audit reporting is a requirement -- especially at the attribute level. I'm not really sure whether attribute-level auditing is possible with MIIS without some heavy scripting built into each attribute flow rule.

SQL FLEXIBILITY

DSE FROM and TO passes have a SOURCE tab and a DESTINATION tab. When importing data from a SQL database, the SOURCE tab allows you to create a custom query that is as advanced as you would like. In addition, if you are creating a TO pass to any type of data source, you can write an advanced query against the collect database to select whatever subset of users you can imagine. MIIS is limited to importing from a single table (or view). And management agent operations run against the entire connector space for that agent (or the deltas). MIIS lacks the ability to create custom queries that define a subset of users to which a set of actions would be applied.
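As a toy illustration of the custom-query idea, here's how a TO pass might select an arbitrary subset of users from a collect database; the schema is hypothetical:

```python
# Illustration of the "custom query" idea: selecting an arbitrary subset of
# users from a collect database to feed a TO pass. Schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE collect (uid TEXT, dept TEXT, status TEXT)")
conn.executemany("INSERT INTO collect VALUES (?, ?, ?)", [
    ("jdoe", "Sales", "active"),
    ("asmith", "IT", "active"),
    ("bjones", "Sales", "disabled"),
])

# Only active Sales users are exported by this TO pass; everything else in
# the collect database is left untouched.
subset = conn.execute(
    "SELECT uid FROM collect WHERE dept = 'Sales' AND status = 'active'"
).fetchall()
print(subset)  # [('jdoe',)]
```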

DATA TYPES

MIIS is obviously an identity data tool. You can modify the metaverse schema to accommodate whatever object types you need, but it's designed to manage identity data. DSE is designed to handle any kind of data. There are no pre-configured data structures. DSE does schema discovery on any kind of data source and can easily be used to synchronize any type of data. For example, if you want to synchronize customer order data from one type of database to another, DSE can handle it as easily as it handles customer identity data.

COST

I think the two products are similar in price, but DSE allows the customer to leverage existing server and database infrastructure rather than requiring new licenses for Windows Server and SQL Server.

CONCLUSION

I realize that most of these points favor DSE over MIIS, but that wasn't my predetermined intention. These are simply what came to mind when thinking about the differences between the two products. If you are a C# developer who lives in Visual Studio and already owns an open SQL Server license, you may prefer MIIS. Outside of that scenario, I think there are a number of compelling reasons to consider DSE. I'd be interested, though, in hearing opposing viewpoints -- there may be a perspective I haven't considered. My hope is that this summary paints a picture of the basic differences between the two products and lets the reader think about how those differences may or may not affect a given environment.

Wednesday, March 22

Identity Management Project Continuum

I recently had a discussion with some sales folks who were interested in the Identity Management project lifecycle. The question came from a product sales perspective as in How do we know where a client stands in the big picture of IdM? ...or more pointedly How do we know which product to pitch to a given company based on how far along they are? I laid out what I like to call the Identity Management project continuum.

Implementing IdM is not a single project. Nor is it even a few stand-alone projects. I call it a continuum. The folks at TNT recently posted a blog entry describing IdM as a lifestyle. I think that's a great way to think about it. I was, though, a little annoyed by their claims regarding software vendors. They suggested that because we sell software, we don't understand the customer perspective. I think they're wrong about that -- at least about some of us. In an ideal world, businesses looking to deploy IdM would have someone competent steering the ship -- maybe an employee, maybe a consultant, but probably not a software vendor. We provide tactical tools to get the job done, but that doesn't mean we don't get the big picture. We just typically wouldn't want ownership of the big picture. That's not our focus.

To get back to the topic, the continuum is not black and white. It necessarily varies for every business based on its data, infrastructure, processes and business needs. For any given business, the phases will occur in different orders, their priorities will vary and some phases may not be required at all.

Below is a sample outline of what I described as the continuum. It's meant to be a general guideline and a starting point for discussion. It's certainly not trying to be a one-size-fits-all project plan.

Vision and Roadmap - This is important. You should identify and clearly document the goals, business drivers and overall approach. List the general timeframes and expectations.

Data Cleansing and Reconciliation - Most organizations have multiple data sources that are stored in different formats with different technologies. Step one is usually identifying the data sources, cleaning the data as-needed and creating attributes that can be used to join records together.

Basic Account Provisioning - The first step of provisioning may be as simple as automated account creation, but could also include single-step workflow or automation of group/role memberships.

Basic Password Management - Management of passwords is often a key driver for IdM projects due to the organizational cost savings.

Basic Auditing - This step should involve initiating the collection of audit data and a few basic reports. Advanced reporting based on captured data can be implemented down the road.

Build/Strengthen Centralized App Authentication - This can be implementing SSO, consolidating authentication mechanisms, reducing the number of authentication stores or otherwise improving the application authentication infrastructure.

Advanced Provisioning - Build upon the basic provisioning infrastructure with advanced workflow, additional business rules, improved deprovisioning functionality and inclusion of additional data sources and/or applications.

Internal Federation - With an established infrastructure for authentication and entitlement, federation may be the next step. Here you adopt a standard and think about how you want to pass authentication information across security boundaries.

External Federation - After the basic federation infrastructure is in place, you may be ready to engage in cross-organizational federation with customers, service providers and partners.

One other thing...

While I'm writing about projects, I'd like to give a nod to a great set of blogs by Mark Dixon about IdM project risks. While I don't think the ideas are completely original (I'm sure Mark would agree), they are indeed brilliant. And Mark organizes and explains the information very well. If you are embarking on an IdM journey, these are a must read:

Seven Identity Mgt Implementation Risks, Mark Dixon (1/25/06)
Identity Risks - Poor Pre-Project Preparation, Mark Dixon (1/31/06)
Identity Risks - Poor Requirements Definition, Mark Dixon (2/04/06)
Identity Risks - Large Initial Scope, Mark Dixon (3/14/06)
Identity Risks - Inexperienced Resources, Mark Dixon (4/14/06)
Identity Risks - Poor Project Methodology, Mark Dixon (4/24/06)
Identity Risks - Scope Creep, Mark Dixon (7/26/06)
Identity Risks - Not Using Available Support, Mark Dixon (7/27/06)

I'm really glad Mark is writing about these. If he hadn't, I might have felt the need to try it myself. I probably wouldn't have done such a nice job and definitely wouldn't have had the audience reach. ...I look forward to reading more from Mark about the four other risks.

Tuesday, March 21

MaXware HQ: Trip Report

On Saturday, I returned from Norway just in time for my family's St. Pat's celebration. It was a very impressive trip. ...and the party wasn't bad either. The folks in Trondheim are certainly on the ball. The first few days of training were focused on our synch and provisioning products. On day four, I learned how to implement all of the virtual directory scenarios I've been talking about in this blog. Day five was all about federation and federated auto-provisioning. Very cool stuff. These guys are brilliant!

I heard a story that the concept of Virtual Directory came out of a discussion between Kim Cameron (with ZoomIt at the time) and our CTO. Something like, "Wouldn't it be nice if we didn't need a persistent data store between the originating data stores?" Well, our CTO went back to the office and wrote what I believe to be the first Virtual Directory (then called an LDAP proxy). That's a nice legacy.

Some of the training discussion was review, but I also learned a number of advanced techniques. For example, our data synch tool doesn't require the data source to keep track of delta changes because we can store the deltas in a central database, which is vital for auditing down to the attribute level. We can also roll back account provisioning if creation fails in one of the downstream apps. And in one day, I installed MaXware Virtual Directory and set up custom directory views based on logon, created a joined account scenario, searched a SQL table via LDAP and restructured the virtual LDAP DIT based on the querying app. I was amazed at how simple this stuff was to install and configure. (Sorry if I sound like an infomercial, but I am very quickly learning to be product-biased.) It's a weird feeling to come from a product-agnostic environment (where I always preached product independence) and now find myself part of a product company. When I joined the company, I knew MaXware by reputation, but didn't have much hands-on experience. So, I'm glad to find that when I look under the hood, I'm seeing some very nice technology -- feature-rich, easy to use and built on a solid, mature code base.

I was also impressed with the Norwegian people. Not that I was surprised, but everywhere I went people were helpful and friendly. And what a beautiful country! I really couldn't have asked for more.

Wednesday, March 8

One more post on Virtual vs. Meta

So, I already wrote that the question of Virtual vs. Meta is not the right question -- they are complementary solutions. Here's one more thought on identifying where each fits.

Every organization embarking on an IdM journey needs to begin by identifying data stores and collecting, cleansing, transforming & reconciling data. At this stage, traditional synch tools (metadirectory) are probably the right tool for the job.

If you already have a reasonably good set of data and are looking to provide additional or customized views into that data (via LDAP, SOAP, etc.), then Virtual Directory is probably the answer.

Of course, this is a very generalized view, but I think it's a good starting point. One of the things that makes this question confusing is the functional overlap between the two solutions -- and there is plenty. But the point is to find the best-fit solution for a given set of business challenges.

Thursday, February 23

Internal users, AD password synch and Virtual Directory

At MaXware, we have a product that hooks into Active Directory to capture password changes. Once a change is captured, we can synch passwords downstream to connected data stores (directory, SQL DB, etc.). Like other implementations of this type of solution, ours requires a component installed on each AD DC to identify and pick up password changes. Although it's not generally preferable to install software on Domain Controllers, it serves a specific purpose in this case, and AD administrators have been known to give in to their business counterparts to make it happen.

The underlying need is probably obvious -- multiple apps each have associated passwords. And more passwords mean more to remember, a lower likelihood of using strong passwords and a higher likelihood of insecure password practices (written notes, etc.).

As we all know, another way to solve the problem is to implement Single Sign-On or Reduced Sign-On solutions that enable applications to leverage a common authentication mechanism (or at least a common auth store). These solutions are complicated and costly to implement. And you usually need a way to synch with AD anyway, since corporate users typically change their passwords using Ctrl-Alt-Delete.

ADAM directories serve as one possible solution for some of this pain. You can implement an ADAM directory to serve as the application identity store, or even as a central authentication store that passes authentication credentials through to Active Directory. This means a single password stored in Active Directory can serve the network environment as well as the apps that leverage ADAM as the auth store. If you centralize the store, you don't really NEED to centralize the authentication mechanism, which might be nice in some cases. In environments with a large number of custom-developed applications, a centralized mechanism would probably provide significant value even if there's already a common authentication store.

Wouldn't it be nice, though, if you could do the same thing without needing to implement ADAM directories and without storing the AD password in two locations? What if a Virtual Directory could pass authentication requests to Active Directory the way that ADAM does? You could potentially use the Virtual Directory to expose only specific attributes to given applications, reduce the overhead of an additional directory infrastructure, eliminate the need to store the AD password in multiple locations (and the need to install agents on AD DCs) and still achieve authentication to multiple systems using a single set of credentials.
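The routing logic behind that kind of pass-through authentication can be sketched as follows. The back-ends are stubbed out as simple callables; in a real deployment each would perform an actual LDAP bind against AD or the Sun directory, and all names here are hypothetical:

```python
# Sketch of the routing behind pass-through authentication: the virtual
# directory holds no passwords itself; it forwards each bind to a back-end
# chosen by the user's branch of the tree. Back-ends are stubbed as callables.
from typing import Callable

def ad_bind(user: str, password: str) -> bool:
    """Stand-in for an LDAP simple bind against Active Directory."""
    return (user, password) == ("jdoe", "s3cret")

def sun_bind(user: str, password: str) -> bool:
    """Stand-in for a bind against a Sun directory for non-employees."""
    return (user, password) == ("vendor1", "p4ss")

BACKENDS: "dict[str, Callable[[str, str], bool]]" = {
    "ou=employees": ad_bind,
    "ou=partners": sun_bind,
}

def passthrough_bind(dn: str, password: str) -> bool:
    # Split "uid=jdoe,ou=employees" into the uid part and the branch.
    uid_part, branch = dn.split(",", 1)
    user = uid_part.split("=", 1)[1]
    backend = BACKENDS.get(branch)
    return backend(user, password) if backend else False

print(passthrough_bind("uid=jdoe,ou=employees", "s3cret"))   # True
print(passthrough_bind("uid=vendor1,ou=partners", "wrong"))  # False
```

The key property is the one the post argues for: the password lives only in the back-end store, never in the virtual layer.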

[UPDATED Mar. 8, 2006] - Shortly after posting this, I was informed that MaXware Virtual Directory (along with its competition) does, in fact, support pass-through authentication to AD. I guess I thought ADAM was doing more than a simple bind, which is not the case -- I think I learned that years ago and forgot it again. So, I removed the question of technical feasibility. I guess the question becomes: why aren't more people talking about this? It seems like it would be an incredibly useful application of our product.

[UPDATED Mar. 18, 2009] - I posted more information about ADAM pass-through authentication.

Tuesday, February 14

The Value of Virtual: Part II

For the past week or so, I've been thinking about the value proposition for virtual directory technology (Virtual-D). Since I was first introduced to it, Virtual-D has been presented to me as a new way to sync data -- an alternative to metadirectories and traditional sync tools. So, I initially thought about its value relative to metadirectory/sync (as in Virtual-D vs. Sync).

Today, a co-worker and I discussed the significant overlap in the two feature sets and the various scenarios that might call for one or the other. Our enlightened conclusion was that Virtual-D and sync tools are complementary parts of a complete IdM solution.

Then, on the drive home, a light bulb went on. Virtual-D doesn't replace metadirectory; it replaces... [drum roll] directory. Hence the name. In an end-to-end IdM solution where you might want multiple directory instances with application-specific attributes and/or security mechanisms, you can replace much of the cost and complexity of numerous directory instances with a Virtual-D solution.

If the goal is to LDAP-enable an application (e.g. SSO, white pages) with enterprise identity information from existing clean data stores, Virtual-D is much less complicated to implement than traditional sync solutions. It doesn't require another data store and it helps circumvent typical political data ownership issues. However, Virtual-D requires clean, current data.

So, if you don't have good identity data available, you might look to aggregate data from the numerous data sources around your enterprise and create an enterprise directory (or database). This is probably best accomplished using traditional data sync tools. Then, Virtual-D can use that newly compiled identity data store to expose relevant subsets of your identity data to your various applications -- without the need for additional directory instances. And it does so regardless of the data layout or the technology used by the sync tools.

With Virtual-D in the toolbox, it's easy to see why your metadirectory and your enterprise directory don't actually need to be directories at all. They can be relational data solutions. It's easier to store, manage and sync data in relational formats. And most companies already have relational database expertise (and often even licenses). When it's time to expose the data to an application, Virtual-D presents it in LDAP format, achieving application interoperability and minimizing risk by presenting only relevant attributes to each app.
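As a final toy example, here's how a virtual layer might present a relational row as an LDAP-style entry while exposing only the attributes a given application is allowed to see; all names and mappings here are hypothetical:

```python
# Sketch of the last step: presenting a relational row as an LDAP-style
# entry, exposing only the attributes the calling application may see.
# Column names, attribute mappings, and the DN layout are all hypothetical.
row = {"emp_id": 42, "full_name": "Jane Doe",
       "email": "jane.doe@example.com", "salary": 90000}

# Only mapped columns are exposed; 'salary' never leaves the database.
ATTRIBUTE_MAP = {"full_name": "cn", "email": "mail"}

def row_to_entry(row: dict,
                 base_dn: str = "ou=people,dc=example,dc=com") -> dict:
    entry = {"dn": f"cn={row['full_name']},{base_dn}",
             "objectClass": ["inetOrgPerson"]}
    for column, attr in ATTRIBUTE_MAP.items():
        entry[attr] = row[column]
    return entry

print(row_to_entry(row))
```

The mapping table is the whole trick: the application sees a normal LDAP entry, and the relational source needs no schema changes at all.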