MTC: With AI Creeping Into Our Computers, Tablets, and Smartphones, Lawyers Need to Be Diligent About The Software They Use.

Lawyers need to be wary of what the computer company behind the curtain is doing with their data!

As Apple is anticipated to announce a new iPhone with AI baked into its operating system, lawyers, like Dorothy in the Wizard of Oz, can no longer stand idly by and trust that the person behind the curtain, i.e., the creator or owner of their software, is trustworthy, will not use customers’ data in ways inconsistent with the data owners’ objectives, and will protect their personal identification information. Per ABA Model Rule 1.6(a), lawyers must take reasonable steps to ensure that their clients’ Personal Identification Information (PII) is protected. And recent events show that this is a bit of a minefield, and not just for lawyers.

I use a popular subscription service called SetApp, which gives me access to over 240 applications, many of which I use daily. But one of its applications, Bartender (which helps clean up and manage your Mac computer’s menu bar), was recently and quietly purchased by a private company. The problem is that little is known about that company. There is a very legitimate concern that Bartender may be improperly using its customers’ computer data – apparently (but not confirmed to be) taking unauthorized screenshots. (Note that this is not a critique of SetApp, but I am going to reevaluate my use of Bartender – here are some alternatives you may want to check out.) But this general concern does not end with just “unknown” Wizards.

It was recently discovered that Adobe changed its customers’ terms of service. Lawyers should be deeply concerned about Adobe’s updated terms of use for Photoshop, which grant the company broad rights to access and remove users’ cloud-stored content. This raises significant privacy and confidentiality issues, particularly for legal professionals handling sensitive client data: material covered by non-disclosure agreements (NDAs), PII, and trial strategies. Adobe’s ability to view and potentially mishandle files covered by NDAs could lead to damaging leaks and breaches of client trust. You can “opt out” of this by going to your account’s privacy settings, opening “Content analysis,” and making sure the “Allow my content to be analyzed by Adobe for product improvement and development purposes” option is not selected. You can also avoid uploading your material to Adobe’s cloud service. These steps may provide an extra layer of protection, but no one is 100% sure.

As custodians of confidential information, lawyers have an ethical duty to safeguard client secrets, and Adobe’s overreaching policy cuts against that duty. These concerns extend beyond software: computer companies are now integrating AI into their hardware as well.

Many Windows PC makers are designing their computers to work natively with Microsoft’s own AI, Copilot. At the time of this writing, Apple is expected to announce a new operating system with AI built in to work with its new M4 chip. In other words, hardware and software companies are working together so that their machines run operating systems with AI baked into the software. The biggest concern on lawyers’ minds should be how their data is being used to train a company’s AI. What protections are being built into these systems? Can users opt out? What does this all mean for us lawyers?

This means that lawyers at any computer skill level must pay attention to the Terms of Service (ToS) for the computers and software they use for work. The warning signs are there. So, stay tuned to your Tech-Savvy Lawyer as we navigate through this together!

MTC