Facebook is conducting MRI research with NYU Langone


Photo: Mandel Ngan (Getty Images)

Most of us might think of Facebook as the social network of choice for suburban moms and conspiracy theorists, but the company isn’t shy about its ambitions to become much more than an app on our phones, even if that’s the last thing we want. Here’s an example: earlier today, Facebook put out a company blog post describing its latest venture, this time into the wild world of medicine.

As the post explains, the company’s AI research wing, called FAIR, has been quietly working alongside professionals at NYU Langone Health for the past two years to create what they call fastMRI: an algorithm that promises to cut down on the long-as-hell process people typically go through when they step into an MRI machine. They just need some pictures of your bones to do it.

Okay, not all of your bones, at least not yet. The first round of research for the fastMRI program is based exclusively on a massive open-source library of images from knee MRIs that NYU helpfully offered up for the sake of the project. By training a machine-learning algorithm on these knees, the team was able to create a model that can produce an accurate MRI image with a quarter of the data a typical MRI machine takes in to build a crystal-clear picture of your bones or brain or whatever it is you’re having scanned. Or put another way: because an algorithm does the heavy lifting, you spend less time being photographed while trapped in a strange, noisy metal tube.
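
To make that “a quarter of the data” idea a bit more concrete, here’s a minimal, purely illustrative Python sketch of the underlying concept: an MRI scanner measures frequency-domain data (so-called k-space), and an accelerated scan simply skips most of it. The toy phantom, the 4x sampling pattern, and the naive zero-filled reconstruction below are all my own stand-ins, not anything from Facebook’s or NYU’s actual pipeline, which swaps that naive final step for a neural network trained on the open knee dataset.

```python
# Illustrative sketch only: a toy "scan" reconstructed from a quarter of
# its frequency-domain (k-space) data. fastMRI's real contribution is
# replacing the naive zero-filled inverse FFT below with a learned model.
import numpy as np

# A stand-in 256x256 "anatomy": a bright disc on a dark background.
y, x = np.mgrid[-128:128, -128:128]
image = (x**2 + y**2 < 80**2).astype(float)

# Full k-space: the 2D Fourier transform is what the scanner actually measures.
kspace = np.fft.fftshift(np.fft.fft2(image))

# Simulate a 4x-accelerated scan: keep only every 4th line of k-space
# (real accelerated protocols also keep a fully sampled low-frequency center).
mask = np.zeros(kspace.shape, dtype=bool)
mask[::4, :] = True
undersampled = np.where(mask, kspace, 0)

# Naive reconstruction: inverse FFT of the zero-filled k-space. This is the
# artifact-ridden baseline that a trained reconstruction model has to beat.
recon = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))

rmse = np.sqrt(np.mean((recon - image) ** 2))
print(f"Kept {mask.mean():.0%} of k-space; naive reconstruction RMSE: {rmse:.3f}")
```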

You see, the main reason any given session in an average MRI machine can end up lasting more than an hour comes down to the way these machines work in the first place, which is… a bit complicated to explain. In short: if a machine is, say, scanning a person’s head (or brain), that means applying a super-strong magnetic field to that head, hitting it with radiofrequency pulses, and then forming a composite image based on how the god-knows-how-many protons in that head behave once they’re subjected to those signals. As it turns out, the signals those protons give off can be very weak, which means the whole process may need to be repeated again and again to form a crystal-clear composite.
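
For a rough sense of why those repeat passes are needed (this is my own back-of-the-envelope gloss, not anything from Facebook’s post): the signal from repeated acquisitions adds up linearly, while random noise only adds up as a square root, so image quality improves slowly with every extra pass.

```latex
% Back-of-the-envelope signal averaging, assuming uncorrelated noise:
% over N repeated acquisitions the signal grows like N while the noise
% grows like sqrt(N), so the signal-to-noise ratio scales as
\mathrm{SNR}(N) \;\propto\; \frac{N\,S}{\sqrt{N}\,\sigma} \;=\; \sqrt{N}\,\frac{S}{\sigma}
% i.e. cutting scan time in half costs roughly a factor of 1.4 in SNR,
% which is the gap a smarter reconstruction has to make up.
```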

Using AI to cut down on the time it takes to get that definitive image is not a new idea by any stretch, and I’ll be the first to admit that it sounds like a great one, until you remember that Facebook is one of the names behind this particular project. This is a company whose explosive growth is largely built on collecting our data, bundling that data, and then slipping it to third parties like major advertisers or federal agencies. And that’s not even counting the massive data breaches the company somehow keeps finding itself at the center of.

And just like Facebook’s ambitions as a platform, its data resources are rapidly deepening, too: Facebook not only knows what we do on its own platform and on Instagram, it also knows what we buy, where we buy it, and, well, tons of other things, thanks to its oodles of partnerships across a buffet of industries, including, surprise surprise, big pharma and medicine.

Just this year, the platform made a noticeable push to court ads from major medical brands, and it seems to be working, thanks in part to the pharma-related data we’ve been unknowingly giving up about ourselves online. And if that part of our medical history is fair game for targeting, then there’s not much stopping Facebook from doing the same with another kind of medical dataset, even one that comes from our literal bones and organs. Typically, these sorts of details would be covered under laws like HIPAA, but as we’ve covered before, the lines between what can and can’t be monetized get kind of blurry depending on whether that data comes from a doctor’s office or a tech company. And in the case of something like fastMRI, or something like Alphabet’s Verily, we have a collaboration between public health and privatized tech, which means HIPAA may not protect these MRI scans as much as we’d hope.

Facebook undoubtedly foresaw some of the potential discomfort before putting out this blog post, which is why the company tucked this little disclaimer into the middle:

(The fastMRI data used in the project, including the scans used for the research, come from the open-source dataset NYU Langone created in 2018. Before the data was made open, NYU Langone ensured that all scans were de-identified, and no patient information was available to reviewers or researchers working on the fastMRI project. No Facebook user data was contributed to the creation of the fastMRI dataset.)

Okay, so it seems these bone scans aren’t being used for tracking and targeting, at least not yet, but the team’s own blog post makes it sound like the fastMRI project doesn’t stop with photos of strangers’ knees. “Today’s clinical study is an important step forward, but much more needs to be done. Next, researchers from Facebook AI and NYU Langone want to show that fastMRI works just as well with other vital organs, such as the brain,” Facebook wrote.

Even if we scrap the whole “my bones are being used against me” narrative (which, I’ll admit, is more speculative than I’d prefer), there are still a ton of reasons you wouldn’t want this company anywhere near your medical records. This is a company that has shown repeatedly that it will put its profit margins ahead of the safety of its users, no matter how much it tries to claim otherwise. Hell, the same day Facebook put out this AI research, reports surfaced that the company is still withholding evidence about its role in the wave of genocide in Myanmar back in 2017. Here in the states, the company continues to refuse to take any action against the antivaxx groups that have already resulted in the death of at least one child. And, of course, there’s that whole goat-killing thing.

I’m not saying that Facebook’s medical research won’t genuinely help anyone, but I am saying that fastMRI might be a lot less icky if it were spearheaded by literally any other company.
