A new report accuses Talkspace, the mobile therapy startup, of mining data from clients’ private therapy conversations. If true, the accusation raises serious ethical questions about the tech company’s respect for patients’ rights and its understanding of the strict ethical rules that govern patient confidentiality.
Former employees and therapists at Talkspace told The New York Times that anonymized conversations between medical professionals and their clients were regularly reviewed by the company and mined for insights. Because the text chats are considered medical records, users cannot delete the transcripts. One therapist claimed that after she referred a client to resources outside the Talkspace app, a company representative told her she should instead advise clients to keep using Talkspace – even though she says she had not disclosed that conversation to anyone at the company. The company claimed that the conversation may have been flagged by an algorithmic review, according to The Times.
A few former employees claim that Talkspace data scientists reviewed clients’ transcripts to find common phrases and mine them to better target potential customers. Talkspace denied that it mined data for marketing purposes. Similarly, a number of therapists told The Times that Talkspace seemed to know when clients worked for “enterprise partners” such as JetBlue, Google and Kroger, and would pay special attention to them. One therapist claims that when the company thought she took too long to respond to two Google clients, it stepped in and expressed concern. The Times also reported that the company was working on bots to catch clues that a client may be in distress, apparently to help therapists catch red flags they might otherwise miss.
A Talkspace spokesman denied the allegations. “We pay special attention to all of our enterprise partners and their employees, just as we do any consumer client,” John Reilly, general counsel at Talkspace, told Salon by email. “The main difference is that onboarding a large enterprise account is a bit more complicated than signing up a single individual, so we have comprehensive implementation protocols and implementation managers for each large enterprise client to ensure a smooth transition at the beginning of any relationship.”
As for the bots, Reilly told Salon that “we are providing our therapist network with an array of analytic tools for their digital practice. One program will review the encrypted text to alert the therapist to language that may point to a client’s emerging problems or escalating language trends.”
The Times highlighted the story of a man named Ricardo Lori, who was hired into the company’s customer service department after being an avid user of the app for years. When an executive asked him to read excerpts of his therapy logs to staff to give them a better sense of the user experience, and assured him that he would remain anonymous, he agreed. However, after the presentation, the Times reported, Lori’s trust was betrayed.
As Mr. Lori drank from a tall glass of red wine, he noticed that a few employees were looking his way. Afterwards, a member of the marketing department approached and asked if he was OK. Later, Oren Frank, the CEO, and Mrs. Frank thanked him in the elevator. Somehow, word had spread that Mr. Lori was the client in the presentation.
In response to these allegations, Talkspace wrote in a post on Medium that many of its answers to questions from Times interviews did not make it into the final story, and that answers from a leading clinician who supports Talkspace were also left out of the story.
Despite the claims of the former employees and other sources, Reilly denied that Talkspace had data-mined transcripts for marketing, saying that only data stripped of identifiable user information was used for quality control.
The Health Insurance Portability and Accountability Act (HIPAA) imposes strict rules on healthcare providers when it comes to sharing patient information. It specifically states that health care practitioners can only share medical information with one another, and only for the purpose of providing adequate medical care to their patients; that they are not permitted to disclose private medical information to the general public, even accidentally; that patients have the right to view and correct their records as necessary; and that personal medical information may not be disclosed for providers’ marketing purposes.
“If it’s true that Talkspace used information from private therapy sessions for marketing purposes, that’s a clear breach of trust with its customers,” Hayley Tsukayama, a legislative activist for the Electronic Frontier Foundation, told Salon via email. “All businesses need to be very clear with their customers about how they use personal information, make sure they do not use information in ways that consumers do not expect, and give them the opportunity to provide meaningful consent for those purposes. Talkspace trades on its trustworthiness and often mentions privacy in its advertising campaigns. Its actions must live up to its promises.”
Talkspace has previously come under fire for its ethics, work practices and effectiveness. The company was accused in 2016 of pushing therapists to use scripts that promote Talkspace services, of lacking sufficient plans for at-risk patients, and of monitoring conversations between therapists and patients. (That story was originally reported by The Verge.) The company was also accused of threatening legal action against therapists who tried to establish relationships with clients off the platform, which CEO Oren Frank said only happened “in a few very unusual cases” because the therapists were thought to be engaged in “serious ethical violations or potentially dangerous communication.”
Like other online therapy apps, Talkspace is not covered by most health plans and generally costs hundreds of dollars out of pocket for a regular subscription. Therapists who spoke with Salon last year about online therapy apps like Talkspace said such apps paid therapists poorly and obscured the actual pay. That echoes what many gig workers have observed at similar contingent-labor companies like Uber, DoorDash and Lyft: that employers obscure real wages to make their gig work seem more attractive.