No matter how good your data about your people is, it can always be improved by incorporating more relevant data. Even if you've done a great job capturing and organizing your internal information, there is more information outside your firewall that paints a more holistic and accurate picture of who your people are.
Done thoughtfully, the aggregation of internal and external information makes experts and expertise more findable within your firm. In fact, our research showed that people were more likely to keep their information up-to-date on LinkedIn than they were in any internal repository. By pulling in outside data, we think we've found an approach that makes it easier to find people based on information about them, regardless of where that information is stored. We also realized a fortunate side effect of doing this: our people can compare their internal and external "digital footprints" to ensure they are consistent.
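That footprint comparison can be sketched very simply. The profile shapes and field names below are illustrative assumptions, not a real schema; the point is just to surface fields where the internal and external versions disagree.

```python
# Hypothetical sketch: compare a person's internal and external
# "digital footprints" so they can spot inconsistencies.
# Field names and profile contents are illustrative assumptions.

def footprint_diff(internal, external, fields=("title", "skills", "location")):
    """Return the fields whose values differ between the two profiles."""
    diffs = {}
    for field in fields:
        a, b = internal.get(field), external.get(field)
        if a != b:
            diffs[field] = {"internal": a, "external": b}
    return diffs

internal_profile = {"title": "Senior Consultant", "skills": ["strategy"], "location": "NYC"}
external_profile = {"title": "Managing Consultant", "skills": ["strategy"], "location": "NYC"}

print(footprint_diff(internal_profile, external_profile))
# Only 'title' differs, so that is the field the person should reconcile.
```

In practice the hard part is field mapping (an internal HR title rarely matches an external headline verbatim), but even a crude diff like this gives people a starting point for keeping the two versions consistent.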
When working in the highly risk-averse culture of a corporation, you learn that there are flavors of risk aversion. When it comes to the risks around people data, most of those flavors have to do with fear of getting the pants sued off the company. Some of the more challenging flavors of risk are described below.
It became clear in our conversations that what the lawyers were concerned about were situations where discrimination could be implied. They were particularly sensitive about user-created information that could adversely affect staffing decisions. In order to avoid delving into some of the icky real-world scenarios where this could happen, I created some characters to tell the story.
On the one hand, we have Joe Juggler. He's an ideal employee who has performed exceptionally on all his projects. When he is not wowing his clients and managers with his work, he likes to let off steam by juggling. He is passionate about juggling and sees it as a metaphor for his work style - being able to keep several balls in the air at once.
On the other hand, we have Susan Staffer. Her job is to put together the best possible project teams. She looks at a wide variety of employee profiles, and uses the information available to assemble an "A Team" for each project. And, oh yeah... She HATES jugglers. She had a bad experience in college. She ends up staffing someone less qualified onto the team, even though Joe expressed interest and was recommended for the assignment.
As a result, Joe sues...
The challenge is to "protect your jugglers," which became our mantra in the process of negotiating for access to data. We want to allow Joe to express who he is without creating a situation where Susan could unwittingly discriminate against Joe. It's probably best to train Susan and her colleagues to recognize actions that could be trouble, but any perception of risk will give the lawyers pause.
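One way to "protect your jugglers" is structural rather than behavioral: keep self-reported personal interests out of the staffing view entirely, so Susan only ever sees work-relevant fields. The sketch below is one possible approach under that assumption; the field lists are hypothetical, and a real system would drive them from policy rather than hard-coding them.

```python
# A minimal sketch of "protect your jugglers": self-reported interests
# never reach the staffing view, so a staffer cannot (even unwittingly)
# act on them. The field allowlist is a hypothetical example.

STAFFING_FIELDS = {"name", "skills", "project_history", "availability"}

def staffing_view(profile):
    """Project a full profile down to staffing-relevant fields only."""
    return {k: v for k, v in profile.items() if k in STAFFING_FIELDS}

joe = {
    "name": "Joe Juggler",
    "skills": ["program management"],
    "availability": "June",
    "interests": ["juggling"],  # visible on Joe's profile, filtered from staffing
}

print(staffing_view(joe))  # the 'interests' field does not appear
```

Joe still gets to express who he is on his full profile; the staffing tool simply never surfaces that information in a context where it could taint a staffing decision.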
Data privacy laws
We also realized that in the USA, we are far less stringent about data privacy than our overseas colleagues are. In fact, the term "Axis of Evil" was used to describe the USA and its position on data privacy. Be aware that once data begins to flow across borders, you may not be able to deliver on all the capabilities you envisioned, because laws in another territory might not permit them.
Terms & conditions
Usage of a platform is usually governed by a set of Terms & Conditions. This is made more complicated when a platform is externally hosted. Then, usage is subject not only to your internal Terms, but to the Conditions of the external provider's platform as well. Pulling data from these platforms into a new environment (however beneficial it may be) will be a challenge. The reason is that the language describing how that data will be handled is usually specific to that platform. Pulling data from that platform into another environment creates a legal gray area. If there's anything risk folks hate, it's a gray area.
Many of the risk concerns go away if the user specifically grants permission for the data to be used in a new way. To that end, you can have your users opt in to allow that data to be used in a new context. This is probably most easily accomplished if you can show the value to the user in doing so. We were pretty clear that doing so would augment your ability to be found for the things that you know about, as well as the ability to compare your internal "digital footprint" to your external version. Even the most draconian data protectionists can't prevent a user from sharing what they want to share.
Another benefit is that if your people are opting in their information, you avoid the fuzzy matching problem with external data. You don't want to make someone look like a juggler if they're not. There may be unsavory types out on the internet who share a name with one of your people; you want to avoid crossing those streams.
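The opt-in approach can be sketched as a simple consent record: external data is ingested only when the user has explicitly linked a specific external account, never by fuzzy-matching names against the internet. The identifiers and data shapes below are hypothetical.

```python
# Sketch, assuming a simple consent record: employee -> the external
# account they explicitly linked. No consent record means no external
# data is ingested, and names are never fuzzy-matched.

opt_ins = {
    "emp-001": "linkedin:joe-juggler-42",  # Joe linked this account himself
}

external_profiles = {
    "linkedin:joe-juggler-42": {"skills": ["program management", "juggling"]},
    "linkedin:joe-juggler-99": {"skills": ["knife throwing"]},  # same name, different person
}

def ingest_external(employee_id):
    """Return external data only for an explicitly opted-in link."""
    account = opt_ins.get(employee_id)
    if account is None:
        return None  # no consent: no external data, no guessing by name
    return external_profiles.get(account)

print(ingest_external("emp-001"))  # Joe's own linked profile
print(ingest_external("emp-002"))  # None: never fall back to name matching
```

Because the link comes from the user, the namesake at `linkedin:joe-juggler-99` can never be mistakenly attached to your employee, which is exactly the "crossed streams" problem the opt-in avoids.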
By putting control into the user's hands, you relieve the risk pressure. The challenge then becomes showing the value of doing so to your user.
While the risks may seem steep, the value gained by creating a more comprehensive body of data outweighs them. By allowing your users to opt in their information, you can circumvent the majority of objections.