Why Generative AI Will Change Employee Provisioning, Dynamics, and Conflict

What happens to an employee’s AI training data when the employee changes companies?

Typically (at least in tech), companies are expected to buy the hardware and software employees need to do their jobs. Generative AI potentially changes that. Over time, it learns to adapt and customize the applications it powers to the individual employee using them.

This will have two immediate impacts. First, it will make it increasingly painful for an employee to change companies because they’ll have to redo all that training. Second, it will force employees and companies to rethink who owns the tools the employee uses.

Let’s explore the consequences of these impacts.

The Mechanic Example

I was a Jaguar mechanic years ago but never made Master Mechanic. You could generally tell a Master Mechanic by the tool chest he or she owned because it would contain tools that could cost tens of thousands of dollars. These tools, some of which were custom made, were part of what the mechanic brought to the job. Owning them was factored into how much a mechanic was paid (Master Mechanics often financed their tools, and their salaries needed to take those costs into account).

I think the same thing will happen as generative AI bleeds into the applications employees use. The tools will become tightly tied to how each employee works and will accumulate years of training as they take over more and more of the employee's repetitive work. Think of an apprentice: as the apprentice learns more and more about how the master works, he or she can increasingly do more than just assist and, over time, will become a master, too. These tools learn through observation and will gain capabilities over time because of it.

Thus, in the future, experienced employees will be expected to come to a new job with their generative AI-enhanced tools (or at least the training data) so they can hit the ground running and don’t have to spend months or years retraining a new set of tools.

Much as it was with the Master Mechanic, this should result in income incentives tied to owning your own set of tools. Rather than restricting employees' tool use, firms will increasingly let employees use the tools they have already trained instead of untrained company-owned tools in order to maximize productivity.

Security Risks

This “bring-your-own-tools” practice will open some clear security risks. If employees bring in their own apps and personal hardware, how will the company keep its intellectual property from leaving with them, on their hardware and cloud platforms, when they depart?

In addition, some of the training data used to make these tools smart may itself contain proprietary information, making its transfer risky. So, although employees should own their generative AI tools, the IP risks may prevent it unless the proprietary parts of the training set can be walled off from the user, which, in the short term, may not be possible.

This creates a potential conflict between the employee's need to take the training data with them (the work they've put into customizing the tools becomes part of their portfolio) and the company's need to retain this data because it is proprietary and was created on its nickel.

That's not all. What about tools that allow the user to build a digital clone? Again, the employee should own it, but the company will want it and will have solid reasons to retain the clone, which could lead to some interesting conflicts.

IP Versus Employee Digital Skills

Imagine working for 20 years with generative AI only to change jobs, be laid off, or be fired and find that all that training has to be redone. I expect bridge tools will emerge, and coders will come up with ways to extract and keep the training data they have created -- but if they are caught, that training data could compromise the company's IP and end up creating yet another exposure.

We are only at the beginning of this generative AI wave, and it may force us to rethink how we provision employees and how we ensure that the training sets employees create for generative AI don't become an HR or security problem down the road.

About the Author

Rob Enderle is the president and principal analyst at the Enderle Group, where he provides regional and global companies with guidance on how to create a credible dialogue with the market, target customer needs, create new business opportunities, anticipate technology changes, select vendors and products, and practice zero-dollar marketing. You can reach the author via email.

