Not Quite Ready Yet
Good morning. The Iran War grinds on, and oil prices remain elevated this week. The stock market has mostly been looking past this because corporate profits have been coming in strong during this earnings season. That has helped investors flip their attention away from war and back to growth prospects for AI.
As I’ve mentioned before, I’m cautiously optimistic about AI capabilities, but I don’t use any subscription services when working for you. I don’t use them personally either. Instead, I’m trying to understand as much as I can about how these services work, where they’re headed, and the benefits and risks we’re likely to face along the way. At this point, I want to get excited about these technologies, but the more I learn, the more I seem to want to pull in the reins a bit.
Part of this has to do with how quickly we tend to adopt new technologies while ignoring, or maybe glossing over, privacy concerns. Another issue is how companies are monetizing (or are planning to monetize) their growing hoard of customer and client data. While this practice makes perfect sense for AI providers because it’s a core part of their business, it makes me squeamish when other companies down the food chain start looking at their client data as an asset to leverage. Fundamentally, turning proprietary data into a competitive advantage isn’t new, but the scale and speed at which data can now be mined and distributed to third parties is unprecedented.
It’s not earth-shattering to suggest that implementing AI services across our economy will require us to redefine our concept of privacy. After all, our data goes into helping the technology help us, and to a certain extent it’s necessary to “share”. Are we okay with our data being used by the companies we do business with (and those they do business with) in various ways if it’s anonymized? If so, can we ensure that it stays anonymous? I’ve read about how it’s relatively easy for businesses and fraudsters to de-anonymize data, a process that, ironically, AI could make faster. We already trade some of our privacy for convenience, so the slippery slope only seems to steepen.
Many industries have to comply with privacy rules tied to government regulations. The healthcare industry is one example, as is my industry, financial services. Healthcare providers must follow HIPAA privacy rules that require the protection of patient data, while my industry has its own privacy regulations along with the SEC and state regulators to contend with. The fundamental goal of the various rules makes good sense: safeguard data, use it only for legitimate business purposes, and don’t disclose it without prior authorization or unless legally required. But there are always loopholes for those who choose to exploit them. Does anyone think the regulators can keep up with these technologies? Has that ever been the case? Usually, they respond with new regulations after major breaches hit the news anyway.
In my own business I’ve always tried to take the simple approach that every client household (and their data) sits in a guarded silo where access is granted only by permission. We use encryption and other processes to keep the silos locked down. That’s almost old school these days, and simple in theory, but it quickly gets complicated in practice because of the prevalence of cloud-based software. Still, each connection to the client’s silo is purposeful and, compared to the capabilities AI is promising, pretty straightforward.
That’s evolving as AI-based notetakers take hold in my little corner of the industry. These services record meetings and phone calls and can even reply to emails from clients. The providers conform to privacy regulations and give planning firms the ability to delete client data on demand. They also anonymize client data, but it’s unclear, from website verbiage, from unrelated stories like the one I’m linking to below (and, admittedly, from a little bit of paranoia on my part), whether the data will remain anonymous.
Overall, I’m optimistic about AI, but I still feel uncomfortable giving a third party access to everything you say in a meeting, everything in your file, and all of your account information, even if it can be anonymized and I can delete it later. Maybe I’m missing out on building a private chatbot to answer your emails, but that’s okay. I like replying anyway and wouldn’t dream of handing that off to technology. There may come a time when I leverage these technologies in other ways. Until then, my preference is to keep my protection of your data as simple and secure (and in as few hands) as possible.
Consider all of that alongside the story I’m sharing below, and you might get a better sense of why I’m so squeamish. It’s a reminder of how our data could ultimately be used in unexpected ways.
https://www.proofnews.org/womans-talkspace-therapy-app-sessions-exposed-in-court/
Have questions? Ask us. We can help.
