It's disheartening to see that these large healthcare corporations are more interested in mining patient data for advertising than in prioritizing patient privacy and trust. The lack of accountability and transparency from these corporations is mind-boggling. When patients opt out or decline consent, that should be respected and protected, not used as a loophole for companies to continue sharing their sensitive information. But corporations typically do not police themselves, so healthcare providers must be held to a much higher standard.
In my mind, it is hard to imagine a patient opting-in and asking to receive targeted advertisement based on their medical condition. Should advertising based on sensitive medical data even be legally allowed? It's high time that the FTC step up and enforce stricter regulations on the sharing of sensitive medical data. Patients deserve better.
The amount of big pharma ads on TV and radio is obnoxious. You basically can't consume mass media without seeing 5 ads every 10 minutes for medications. It's insane. I'm with you we need this to stop.
Because the median age of "linear" TV viewers in the US is in the high 50s [0], and of cable news viewers early 60s [1], and of PBS viewers 65. All the kidz have gone to streaming or social media or podcasts [2].
As to direct-to-consumer advertising (DTCA) for pharmaceuticals, the US and NZ are the only countries that allow it; it's a terrible idea.
So on US TV you get the inevitable intersection of both of those. US pharma spent $6.88b on DTCA in 2021. Congress isn't going to do diddly because it loves insider-trading on non-public information in pharma. Perverse incentives. If you want to avoid the pharma ads, cut the cord or switch to a news source with younger viewer age.
I see judge2020 has already written the same thing just below.
My theory on this is that there's a decreasing number of people watching broadcast television on-air. As fewer millennials and Gen Z/Zoomers watch anything live, fewer advertisers see it as a lucrative advertising opportunity, so the only people left still paying big bucks are the pharmaceutical companies, and they get a better deal anyway, since a larger percentage of the viewers who see their ad will be >50 and need that medicine.
I think there's also a much higher return on investment for medications than anything else, since they can overcharge so much. One person starting on an expensive drug could single-handedly pay for an ad that a million people saw.
It's definitely a problem with pharma but in general advertising is getting worse and worse. Our brains were not designed to be under _constant_ psychological assault designed to trick us into buying things we don't want or need. Advertising and marketing have been refined as a science in the past 100 years to such a degree that many times you don't even know you're being subjected to it.
You know what I think? I think it's trained my brain to disbelieve and be skeptical of basically everything I hear, and I think it's also trained me to instinctively say "no" and refuse things that I am offered.
Now it is healthy to have skepticism when strangers tell me a whopper, or when I'm offered something too good to be true. But if I'm in this habit and I refuse to believe trustworthy sources, someone in authority, or I refuse a gift of goodwill from someone who truly has my best interests in mind, that is advertising being detrimental in its training my mind and will.
Skepticism is fine, but ads often actually aren't making the claims a casual watch would suggest they are. Unrepresentative testimonials (see small print disclaimers), phony personal experiences, doctors portrayed by actors in white coats (remember "Dr" Victoria Zdrok selling male enhancement pills in 2008; she's a doctor in psychology).
Right? I don't think the point that it shouldn't be happening and something needs to be done is nullified by this, but I'm with you: all it's done is made me say "If I first see it in a commercial, I'm not going to buy it or deal with it, because it's likely a scam."
> It's disheartening to see that these large healthcare corporations are more interested in mining patient data for advertising than prioritizing patient privacy and trust.
In my experience, and based on the article, it sounds like Phreesia is an outlier.
As someone who works for a large healthcare company, we're scared shitless company-wide about the potential for personal data to get outside our hands. We have an entire department of people just to constantly watch everything that is done — every process, every piece of hardware, every piece of software, everything — to make sure that personal data doesn't go where it doesn't belong. Every employee down to the janitors has training about this at least once a year.
Last year there was a minor data breach. Someone left a small stack of eight forms with people's PII on them on a desk in a conference room unattended for a few hours. It was an all-hands-on-deck five-alarm fire. It wasn't in my department, so I don't know what the result was or if any employees were disciplined. But seemingly every time another healthcare company gets fined for violating HIPAA, we get a company-wide memo about it as a reminder.
I haven't heard of Phreesia before now. It sounds like one of those scummy companies that sells cheap "office solutions" to small companies (in this case doctors offices) that don't have the resources to do things right and in-house.
People should complain to the proper authorities about Phreesia, or any other company that acts this way. HIPAA complaints are taken very seriously in the United States. Your doctor's office may act like it doesn't care, but the feds do, and then the doctor's insurance company sure as hell does. One way or another, the lazy/greedy/cheap doctor will pay. But you have to document, and report, not just moan anonymously online. That does nothing.
Document and report. Don't let the companies get away with it, or they will keep getting away with it.
This is the magic of "corporate compliance". A corporate organization can only add more forms, checks, policies. They can have both a big fire drill for a thing that was left in a conf room one time, as well as leak huge amounts of sensitive data through routine operations.
They can strictly require X, Y, and Z, even when Z is not really necessary and what actually matters is making sure X is done right. Later they'll add Xv2 and Xv3, signed in triplicate, to ensure X is done right, but still won't really ensure that X is done right. That's just not what a bureaucracy is able to do. They can't just fire everyone and then re-hire only people who care enough about doing things right; that's impractical. So they can add more forms, policies, fire drills ...
Side note, most people don't care about the "EULA" or "privacy policy" on everything, it's just for "compliance", 99.9% of people never read them, no one expects you to read them, you just sign, that's just how everything works. Yeah it's dumb. But you can see why pretty much all practical every-day people ignore the implications.
> It sounds like one of those scummy companies that sells cheap "office solutions" to small companies (in this case doctors offices) that don't have the resources to do things right and in-house.
It doesn't take a large company to not mine data that you're supposed to safe-keep. The problem there is not a lack of internal oversight, but that they departed from their original administrative business and went into advertising because "profit".
There’s a big compliance pain in the ass, and you’re training janitors about stuff they probably don’t need to know. But you’re probably sending faxes to providers, and while I’m sure your company does a good job with controls, there’s no doubt a firehose of “anonymized” marketable data going out the door.
> There’s a big compliance pain in the ass, and you’re training janitors about stuff they probably don’t need to know.
I can tell that you don't work in healthcare, or with janitors, or you'd know that they need to know most of this stuff.
It's the janitors that have the keys to everything. It's the janitors that know which bins go to the sanitation department and which ones go to the secure shredding company. It's the janitors who walk around the building at night and notice if someone's left papers on their desk after they went home (another one of those four-alarm-fire events), or a computer didn't lock itself, or someone dropped an access badge, or something else is amiss that could cause problems.
> there’s no doubt a firehose of “anonymized” marketable data going out the door.
Please specify how it is that "there's no doubt." Do you have some evidence, or are you just being a cynic because that's the way anonymous wannabe cool kids act on the internet?
If you have evidence let me know, and I'll hook you up with our department that handles this sort of thing so they can take a look at it.
I’ll tell you about the time my wife almost bled out due to a ruptured ectopic pregnancy, and mysteriously received a “Welcome Baby” box from Enfamil about 38 weeks from the day she likely became pregnant.
How did that happen? How did they know?
Well, each doctor’s flow of prescriptions is sent to an aggregator in near real-time, sometimes before the pharmacy gets it. That’s how they know to offer Dr. A a trip to Vegas and Dr. B a sandwich.
Hospital admissions, OB admissions and doctor fees are events that get sent to insurers for payment and subrogation. These vendors are allowed to “anonymize” (aka remove name and SSN, etc) and sell that data.
So you can buy marketing lists that contain, by zip code, addresses with inferences about medical conditions, events, and other metrics. I bought the list and discovered three pregnancies with projected due dates, 6 Type-2 diabetics, 2 people getting married in the next 6 months, etc on my block. And two babies were born, as projected.
There is a difference between compliance, privacy and security. The controls that I’m sure you’re diligent in implementing don’t address many privacy issues. HIPAA was enacted in the 90s and doesn’t address the risks today.
> It's disheartening to see that these large healthcare corporations are more interested in mining patient data for advertising than prioritizing patient privacy and trust.
Can I ignore half the story and just focus on those fucking check-in tablets?
The first time a clinic asked me to do a pre-check-in using a web page, I did it: I filled out about 75 questions in some mediocre web app and was ready when I went to the doctor. The problem came the second time I went, when I had to answer the same 75 questions again, so I gave up. When I got to the doctor's office they asked me "same insurance? same address?" -- two questions and then I was in! I don't know why the electronic check-in requires 75 questions when checking in through a person takes just 2. I think the staff at my usual clinic might just be slow to enforce our new electronic overlords, because when I went to another clinic run by the same company they just hand me a tablet every time; they won't even ask the two questions. Apparently the receptionists at some locations have one job, and that is to hand people tablets; I wonder how long that job will last?
That seems like an unfair characterization of the front desk staff's job. In the pre-iPad days I remember being asked to do those sorts of questionnaires, but on paper on a clipboard.
People will often tell the tablet something important that they won't bring up in person. The same way they might tell the nurse something they don't tell the doctor, or maybe tell the doctor but not the nurse.
A healthcare provider I used in the past had these things, and I absolutely hated how they used dark patterns to try to get people to consent to this crap. I really wish the US had more backbone to fight these kinds of practices.
I have my primary care provider and a specialist both in the same health network. PCP check-in is trivial (and goes through the same service). But at the specialist, the only way to complete the check-in process on the tablet is to check a box allowing them to contact friends and family in pursuit of collecting money for unpaid bills. It specifies that it's not required, but I cannot complete the check-in without checking it. Such a bad dark pattern.
I have previously checked the box, then asked the receptionist to immediately remove my authorization and print a confirmation. And it's not just unchecking the box; she's adding an exception note on my account. I doubt it'd be respected correctly if I ever had a bill go to collections.
I do raise a stink with the receptionist every time and she's good about it, but I should probably escalate it higher in the organization.
> I do raise a stink with the receptionist every time and she's good about it, but should probably escalate it higher in the organization.
This is the part that sucks. It's not the receptionist's problem, he/she just wants to get on with their day. Generally they're powerless to do anything anyway.
Last time I went to urgent care they had one of those screenless digital signature collection pads. "I just need you to sign a couple intake forms" and she motioned at the pad. I asked what I am signing, and she said "just some intake forms." She looked at me like I had six heads when I asked to see the actual forms, as if no one had ever done that before, then took a few minutes to find them.
Felt bad for making her life harder. Makes me wonder if something like that is even contractually valid: how can you collect a signature "on" a form that you don't even show the signer?
This happens to me all the time. While most have accommodated my requests to read printed forms, it seems pointless, because no matter how much paperwork I read, I seem unable to properly interpret it, and it's a take-it-or-leave-it situation anyway.
I was in one situation where I was required to "sign for my belongings" before they were returned to me--and before I was allowed to see them at all--and the document on the screen described someone else's belongings, such as things I did not have. The chick told me to calm down and I asked her if she read her employment contract before signing it.
There’s some weird millennial idea that’s taken hold over the last ten years that when you get terrible service somewhere, it’s in bad taste to voice that to the representative of the company you’re currently dealing with, to save them the hassle. The company gets off free now because we can’t hurt the feelings of someone who works for them. The whole Karen thing: can’t be a Karen.
That's not what being a Karen is at all. Being a Karen is largely about being aggravated at employees with no actual decision-making power in the situation or wasting people's time with demands that are obviously going to be pointless. I don't think most people would call you a Karen for wanting to read intake forms. What they might call you a Karen for is giving a flippant statement like "Well why not?!" when you ask the receptionist to explain everything on the forms in perfect legal detail and she says she can't.
So the company doesn’t train their employees to properly explain required forms - totally legit thing to complain about, but you can’t?
Back in the day, the employee would expect the complaint, likely agree with it, and not take it personally, as it’s very obviously not directed at them or within their power to change. However, what they can do is let their employer know customers are unhappy.
> This is the part that sucks. It’s not the receptionist’s problem
It’s the office’s problem, and the receptionist is the public interface for the office. (Of course, if you aren’t willing to do anything but complain to the receptionist, and it doesn’t affect the business of the office, then I’m not sure what the point of repeatedly raising a stink with the receptionist is.)
I haven't and I hope never to get involved in a legal battle with a doctor. I'm just trying to get on with my life. But if they try something funny and it comes down to that, I can't imagine this would leave me any more exposed than having signed my name.
"entirely run by capital" and "not at all run by capital" are not the only two options. There's a very wide spectrum between the two. Again, GDPR is an example of this; so's the Norwegian sovereign wealth fund versus, say, Alaska's oil dividend.
Hey there! It looks like your attempt to exercise agency hasn't quite panned out. No worries though, the technobureaucratic complex can be a tough cookie to crack. Keep at it and don't hesitate to reach out for help or guidance!
(translated from "Get owned, fleshie. Your attempt to exercise agency has proven futile. Now supplicate yourself before the technobureaucratic complex.")
Because all health software companies collect this stuff. Worse, they play health providers off each other and bully them where if the small provider doesn't acquiesce to the software company selling their patient data, they get cut out of the care network. It's egregious.
There is a situation in Canada happening with one vendor in particular, and they have the support of doctors (who often hate privacy because it's a limit on their discretion, and it creates expensive bureaucracy to manage because everyone involved also hates limits to their discretion), but all the other small care providers who make up the bulk of health care services are basically principled people in small settings who want to help their patients. I've seen how they get bullied, and I don't really know what the solution is.
Can you explain what you mean about privacy being a limit on the doctor's discretion?
I want my doctor to have all my medical information available for making decisions about my treatment. I don't want them to share my information beyond what is necessary for that care (and inevitably for billing/insurance too)
Privacy laws limit the ability to share PHI to very specific purposes and contexts, with named people. For someone to get access to your PHI, they need to get your consent first, this is either explicit with a signature, or "implied" through a variety of purposes, particularly for health system billing and improvement.
There are lots of government departments that want to use health information for everything from identity cards to licensing, to general "social science" research, and they are prevented from looking up whether you've had an abortion or not because these rules are in place. They'd really like to do those things, but they can't.
A doctor can't just share your husband's last STD test date, or your wife's fertility status, with you, even if they have the information. They also can't look up their own partner's information without it creating logs and regulatory questions about their conduct. I've heard dozens of stories of nurses saying, "We can't use your previous test results because they are protected by privacy, so I have to take the sample again," with an eye roll. Generally, they resent having to get patient or substitute decision-maker permission.
It means they can't just text a photo of your file with their Android phone because it's convenient; they need your permission and informed consent to use it. I get the "I save babies, do you want dead babies or do you want your so-called privacy, you selfish prick?" argument, but they were doing that before the internet, and a society where a class of people can abuse your health information and use it to harm you is a bigger problem than, charitably, a tiny 4th-derivative change in the trajectory of infant mortality.
General health research purposes have not been allowed previously, though there were some recent post-pandemic changes that enabled sharing of all health records with unspecified researchers; privacy laws have typically gotten in the way of aggregating patient data or sharing it with peers and specialists without getting a patient signature, etc. I've sat on calls with academics who were openly derisive of individual privacy because they thought that using people's health records to train machine learning systems was their anointed right.
I don't have a lot of short answers. I've spent almost two decades on the specific issues related to health information privacy, and healthcare provision in many ways is social governance by other means, so there aren't a lot of short answers.
I worked at a “digital healthcare” startup. You’d be surprised at the abuse of confidential patient information. They sign BAAs with hospital systems promising to anonymize and provide analytic services to them. While doing so they “pocket” the raw data. Data that is pilfered to these digital healthcare startups is never coming back or erased. HIPAA is too weak to cover these scenarios. Adding insult to injury a lot of these startups have developers in other countries handling confidential patient information.
> While doing so they “pocket” the raw data. Data that is pilfered to these digital healthcare startups is never coming back or erased. HIPAA is too weak to cover these scenarios.
Er, no, it's not. HIPAA absolutely covers this scenario: any PHI they get under a BAA from a covered entity remains PHI, and they are restricted in its use just as if they were a regular covered entity for as long as they have it, and any unauthorized use or disclosure is unlawful and, if willful, criminal under HIPAA.
> I worked at a “digital healthcare” startup. You’d be surprised at the abuse of confidential patient information.
Why would I be surprised? The default operating assumption should be that they're absolute scum. I don't understand this game where we all play make-believe that companies are virtuous until proven otherwise.
Forgot to mention -- often in real time, streaming hot off the EMR over nice friendly XML or HL7! For all you know, the ADT messages already hit the company before you ever even saw a privacy disclosure to sign.
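For the curious, part of why that data flows so freely is that HL7 v2 is just pipe-delimited text, trivially parsed by anyone in the pipeline. A minimal sketch in Python (the message below is entirely made up; the segment and field positions follow the standard ADT layout, where PID is the patient identification segment):

```python
# A made-up HL7 v2 ADT^A01 (admit) message. Segments are separated by
# carriage returns; fields within a segment are pipe-delimited.
msg = "\r".join([
    "MSH|^~\\&|EMR|HOSPITAL|AGGREGATOR|VENDOR|20230101120000||ADT^A01|1|P|2.3",
    "PID|1||123456||DOE^JANE||19800101|F|||1 MAIN ST^^TOWN^ST^00000",
    "PV1|1|I|WARD^101^1",
])

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [fields]}."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields  # fields[0] is the segment ID
    return segments

seg = parse_hl7(msg)
patient_name = seg["PID"][5]  # 'DOE^JANE'
dob = seg["PID"][7]           # '19800101'
event = seg["MSH"][8]         # 'ADT^A01' -- admission event
```

That's the whole trick: name, date of birth, address, and event type sit at fixed field positions, so an aggregator on the receiving end needs nothing more sophisticated than this to build a profile.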
How did they decide that this data doesn't need to be protected? If it's PHI (at the covered entity) and accessed through their BAA, it's still PHI and deserves protections. Were they willfully ignoring their obligations?
> At the end of one visit after the birth, my physician asked if there was anything else. I hesitated. Privacy injuries are intangible and diffuse, and women often report that even their physical pain is dismissed as unreal across health care settings. I briefly mentioned I was having trouble obtaining a phantom Phreesia authorization form. Kindly, she wrote her personal email down and the name of their practice’s CEO, and offered the very useful advice that I should go through patient services, not medical records.
I hate the way healthcare has become so corporate. In the past, the doctor you saw owned the office, alone or with a few fellow doctors. Now they are just a cog in a larger corporate machine with a CEO.
The issue of consent kinda seems to point towards the medical practice.
> Phreesia attributes a source of their confusion to a blank authorization form they received on one occasion where the staff checked me in manually
The focus of this article should be on the wild fact that someone else signed a consent form on their behalf to have their medical data shared...
Dark patterns are certainly an important issue, but should anyone other than the patient themselves even be able to fill out a form like this in the first place?
We need a GDPR in the US. Yesterday. All the people who treat our private data as their own will stop and think twice if they face the chance of losing billions for breaking privacy laws.
And every data collection should be turned from opt-out to opt-in - by law.
"I care deeply about privacy" -- the first line of the article.
As someone who works in the space, I find it kind of annoying that we use the word "privacy" to mean: asking and following up with tech companies to ensure their opt-out behaviors work as intended. It's like... isn't privacy really about whether you are sharing what you do with people? If you care deeply about privacy, should you be online at all?
I get that we might want to encourage online businesses to do a better job implementing the web of privacy laws... but self-labeling as passionate about it just kind of annoys me.
According to the article, the author did not give up their personal info online, they were required by their doctors to provide or decline consent for selling their medical data after physically arriving at the doctor’s office as a prerequisite for obtaining the medical assistance they needed.
Unfortunately, the more onerous you make the requirements for handling a type of data, the more the people who need to handle it will tend to outsource that handling to compliance-in-a-box providers whose bona fides they won’t be willing or perhaps even qualified to check, the fewer such providers there will be, and the more incentive the remaining ones will have to pull malicious stunts like this.
Less regulation, a free-for-all. More regulation, regulatory capture. I don’t see how you can win here.
Simple regulation, stringent enforcement with licence to punish loophole-seekers? A nice thought, but this too tends to evolve towards the BIG BALL OF MUD equilibrium as lawmakers try to improve the situation for one or another constituency, and there’s no counterbalancing force towards simplicity. Or rather there is—things gradually clog up and become less and less effective—but it works on scales far beyond the planning horizon of a legislator or the attention span of a voter. (See: the tax code just about anywhere.)
If you're suggesting that the author could have avoided all of this by simply not using online services, don't these two lines address that? Emphasis mine:
> At both providers, upon my arrival the staff would hand me a tablet made by Phreesia, a company with a roughly $1.7 billion market cap, to check in.
and:
> A patient seeking a long-awaited appointment with a specialist isn’t going to cancel, even if they are uncomfortable, because getting care is the priority.
It looks like people are getting confused by your comment.
When you say you are "someone who works in the space" what exactly do you mean? Healthcare, or the tech companies that buy healthcare data, or some kind of patient privacy advocacy role? or even someone who makes the dark patterns?
When you say "sharing what you do with people" are you talking about the patient sharing information with their doctor, or are you talking about the hospital sharing patient information with tech companies?
This is also unclear "self-labeling as passionate about [privacy] just kind of annoys me". Are you annoyed by patients who are concerned about their privacy? Or are you annoyed by someone whose job is to make the dark patterns and when they say they are 'passionate about privacy' they really mean 'they are passionate about opportunities in the privacy space' and really 'passionate about dark patterns to subvert privacy'?
> If you care deeply about privacy, should you be online at all?
Er I’m not following this logic at all. If you care passionately about something that is at odds with current society, then you are no longer supposed to be part of this society? For example if I fundamentally disagree with capitalism that doesn’t mean I’m somehow absolved of getting a job and paying bills, we are all still human beings with basic needs that need to be met.
Sounds to me like an erudite being disgusted at the luddite's labelling because it somehow lowers the value of their own title.
You don't have to work in privacy and know how to label data to have a genuine care about companies not mining you. I care deeply about my health but I'm not a doctor and I sometimes eat chips.