Tag Archive for: Least

If A College Is Going To Make COVID-19 Contact Tracing Apps Mandatory, They Should At Least Be Secure

One of the more frustrating aspects of the ongoing COVID-19 pandemic has been the frankly haphazard manner in which too many folks are tossing around ideas for bringing it all under control without fully thinking things through. I’m as guilty of this as anyone, desperate as I am for life to return to normal. “Give me the option to get a vaccine candidate even though it’s in phase 3 trials,” I have found myself saying more than once, each time immediately realizing how stupid and selfish it would be to not let the scientific community do its work and do it right. Challenge trials, some people say, should be considered. There’s a reason we don’t do that, actually.

And contact tracing. While contact tracing can be a key part of siloing the spread of a virus as infectious as COVID-19, how we contact trace is immensely important. Like many problems we encounter these days, there is this sense that we should just throw technology at it. We can contact trace through our connected phones, after all. Except there are privacy concerns. We can use dedicated apps on our phones for this as well, except this is all happening so fast that it’s a damn-near certainty that mistakes are going to be made in those apps.

This is what Albion College in Michigan found out recently. Albion told students two weeks prior to on-campus classes resuming that they would be required to use Aura, a contact tracing app. The app collects a ton of real-time and personal data on students in order to pull off the tracing.

Aura, however, goes all in on real-time location-tracking instead, as TechCrunch reports. The app collects students’ names, location, and COVID-19 status, then generates a QR code containing that information. The code either comes up “certified” if the data indicates a student has tested negative, or “denied” if the student has a positive test or no test data. In addition to tracking students’ COVID-19 status, the app will also lock a student’s ID card and revoke access to campus buildings if it detects that a student has left campus “without permission.”

TechCrunch used a network analysis tool to discover that the QR code was not generated on the device but rather on a hidden Aura website, and that TechCrunch could then easily change the account number in the URL to generate QR codes for other accounts and gain access to other individuals’ personal data.
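For readers unfamiliar with the failure mode TechCrunch describes, this is a classic insecure direct object reference (IDOR): the server returns whatever record matches an identifier supplied by the client, without checking that the requester is actually authorized to see it. Below is a minimal, purely illustrative sketch of the vulnerable pattern and the straightforward fix; the endpoints, field names, and data are hypothetical assumptions, not Aura’s actual code.

```python
# Illustrative sketch of an IDOR bug and its fix (hypothetical, not Aura's code).
from flask import Flask, abort, jsonify, session

app = Flask(__name__)
app.secret_key = "dev-only-not-a-real-secret"  # required for signed session cookies

# Hypothetical in-memory stand-in for the real student datastore.
STUDENTS = {
    "1001": {"name": "A. Student", "covid_status": "negative"},
    "1002": {"name": "B. Student", "covid_status": "positive"},
}

# Vulnerable pattern: the server trusts whatever account number appears in the URL,
# so anyone who edits the ID gets someone else's name and test status.
@app.route("/qr/<account_id>")
def qr_code_vulnerable(account_id):
    student = STUDENTS.get(account_id)
    if student is None:
        abort(404)
    return jsonify(name=student["name"], status=student["covid_status"])

# Safer pattern: ignore client-supplied IDs and derive the account from the
# server-side session established at login.
@app.route("/qr/me")
def qr_code_for_current_user():
    account_id = session.get("account_id")
    if account_id is None or account_id not in STUDENTS:
        abort(401)
    student = STUDENTS[account_id]
    return jsonify(name=student["name"], status=student["covid_status"])
```

The fix is not exotic: the server already knows who is logged in, so it should never trust an account number that arrives in a URL the user can edit.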

It gets worse. One Albion student discovered that the app’s source code also included security keys for Albion’s servers. Using those keys, other researchers examining the app found that they could gain access to all kinds of data from the app’s users, including test results and personally identifying information.
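The hardcoded-key problem is equally well understood: anything bundled into an app that users install should be treated as public, because it can be extracted from the published package. Here is a hedged sketch of the kind of scan a researcher, or a CI pipeline, might run over app source to catch this before shipping; the file extensions and regex patterns are illustrative assumptions and have nothing to do with how the Albion student actually found the keys.

```python
# Minimal hardcoded-secret scanner (illustrative only).
import re
import sys
from pathlib import Path

# Rough patterns for credentials that should never appear in client-side source.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID format
    re.compile(r"(?i)(api[_-]?key|secret|token)\s*[:=]\s*['\"][^'\"]{16,}['\"]"),
]

# File types typically found in a mobile app's source or decompiled package.
SOURCE_SUFFIXES = {".js", ".ts", ".java", ".kt", ".swift", ".json", ".xml", ".plist"}

def scan(root: Path) -> int:
    """Print suspected hardcoded secrets under `root` and return the hit count."""
    hits = 0
    for path in root.rglob("*"):
        if path.suffix.lower() not in SOURCE_SUFFIXES:
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            if any(pattern.search(line) for pattern in SECRET_PATTERNS):
                print(f"{path}:{lineno}: possible hardcoded secret")
                hits += 1
    return hits

if __name__ == "__main__":
    # Usage: python scan_secrets.py path/to/app-source
    sys.exit(1 if scan(Path(sys.argv[1])) else 0)
```

Running something like this in a build pipeline is cheap insurance; the real lesson is that server credentials belong on the server, behind an authenticated API, not in the client at all.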

Now, Aura’s developers fixed these security flaws…after the researchers brought them to light and after the school had made the use of the app mandatory. If anyone would like to place a bet that these are the only two privacy and security flaws in this app, then they must certainly not like having money very much.

To be clear, plenty of other schools are trying to figure out how to use technology to contact trace as well. And there’s probably a use for technology in all of this, with an acceptable level of risk versus the benefit of bringing this awful pandemic under control.

But going off half-cocked isn’t going to help. In fact, it’s only going to make the public less trustful of contact tracing attempts in the future, which is the last thing we need.

Techdirt.

New Study Suggests That YouTube’s Recommendation Algorithm Isn’t The Tool Of Radicalization Many People Believe (At Least Not Any More)

It’s become almost “common knowledge” that various social media recommendation engines “lead to radicalization.” Just recently, while giving a talk to telecom execs, I was told, point blank, that social media was clearly evil and clearly driving people into radicalization because “that’s how you sell more ads,” and that nothing I could say could convince them otherwise. Thankfully, though, there’s a new study that throws some cold water on those claims by showing that YouTube’s algorithm (at least in late 2019) appears to be doing the opposite.

To the contrary, these data suggest that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels….

Indeed, as you read through the report, it suggests that if YouTube’s algorithm has any bias at all, it’s one towards bland centrism.

The recommendations algorithm advantages several groups to a significant extent. For example, we can see that when one watches a video that belongs to the Partisan Left category, the algorithm will present an estimated 3.4M impressions to the Center/Left MSM category more than it does the other way. On the contrary, we can see that the channels that suffer the most substantial disadvantages are again channels that fall outside mainstream media. Both right-wing and left-wing YouTuber channels are disadvantaged, with White Identitarian and Conspiracy channels being the least advantaged by the algorithm. For viewers of conspiracy channel videos, there are 5.5 million more recommendations to Partisan Right videos than vice versa.

We should also note that right-wing videos are not the only disadvantaged group. Channels discussing topics such as social justice or socialist views are disadvantaged by the recommendations algorithm as well. The common feature of disadvantaged channels is that their content creators are seldom broadcasting networks or mainstream journals. These channels are independent content creators.

Basically, YouTube is pushing people towards mainstream media sources. Whether or not you think that’s a good thing is up to you. But at the very least, it doesn’t appear to default to extremism the way many people claim. Of course, that doesn’t mean it’s that way for everyone. Indeed, some people have criticized this study because it only looks at recommendations shown to non-logged-in users. Nor does it mean that it wasn’t like that in the past. This study was done recently, and YouTube is said to have been adjusting its algorithms quite a bit over the past few years in response to some of these criticisms.

However, this actually highlights some key points. Given enough public outcry, the big social media platforms have taken claims of “promoting extremism” seriously and have made efforts to deal with it (though I’ll also make a side prediction that some aggrieved conspiracy theorists will try to use this as evidence of “anti-conservative bias,” despite it not showing that at all). Companies are still figuring much of this stuff out, and insisting that, because of some anecdotes of radicalization, it must always be so is obviously jumping the gun quite a bit.

In a separate Medium blog post, one of the authors of the paper, Mark Ledwich, notes that the “these algorithms are radicalizing everyone” narrative is also grossly insulting to people’s ability to think for themselves:

Penn State political scientists Joseph Philips and Kevin Munger describe this as the “Zombie Bite” model of YouTube radicalization, which treats users who watch radical content as “infected,” and that this infection spreads. As they see it, the only reason this theory has any weight is that “it implies an obvious policy solution, one which is flattering to the journalists and academics studying the phenomenon.” Rather than look for faults in the algorithm, Philips and Munger propose a “supply and demand” model of YouTube radicalization. If there is a demand for radical right-wing or left-wing content, the demand will be met with supply, regardless of what the algorithm suggests. YouTube, with its low barrier to entry and reliance on video, provides radical political communities with the perfect platform to meet a pre-existing demand.

Writers in old media frequently misrepresent YouTube’s algorithm and fail to acknowledge that recommendations are only one of many factors determining what people watch and how they wrestle with the new information they consume.

Is it true that some people may have had their views changed over time by watching a bunch of gradually more extreme videos? Sure. How many people did that actually happen to? We have little evidence to show that it’s a lot. And, now, there is some real evidence suggesting that YouTube is less and less likely to push people in that direction if they’re among those who might be susceptible to such a thing in the first place.

For what it’s worth, the authors of the study have also created an interesting site, Recfluence.net, where you can explore the recommendation paths of various types of YouTube videos.

Techdirt.

At least 18 officers fired their guns in UPS hijacking shootout, union official says – CNN
