Tag Archive for: algorithm

New algorithm helps BYU team put best face forward in security


A group of students working with BYU professor D.J. Lee have built an algorithm that could bring two-factor authentication to facial recognition technology in everything from cell phones to surveillance systems.

The project began almost two years ago, when Lee and some students were looking for an interesting research topic and settled on facial motion and how it could be analyzed.

That work evolved into detecting whether students were paying attention in class, and it eventually morphed into using facial motion to improve the security of facial recognition.

With the world of security constantly changing and hackers adapting to those changes, Lee acknowledged that nothing is perfect in terms of security.

“Fingerprinting is easy to do and people even make fake fingerprints,” Lee said. “The most common one is facial recognition and the biggest problem is, all of these can be used when the user is not aware. When you’re sleeping or unconscious, someone could use your biometrics to get into the system. It’s difficult, people come up with all kinds of ideas to hack into the system.”

He added that a company in Japan makes lifelike facial masks, and that some attackers pull photos from social media pages to unlock devices protected by facial recognition. Even the algorithms themselves can be fooled by photos, and this technology aims to address the biggest concern: unintentional identity verification.

Two-factor authentication is not new technology; companies like Apple and social media apps already use it to verify identity. Integrating it into facial recognition, however, is.

Lee said it is called Concurrent Two-Factor Identity Verification.

“Meaning you show your face and make the facial motion just once, you don’t have to do it twice,” Lee said. “With the facial motion, if people want to use your photo they cannot fool the system since the photo is not moving.”

The technology first performs facial recognition; to satisfy the second authentication step, the user then mouths a secret phrase, moves their lips in a particular way, or makes some other facial motion.

Even if a video is used, the chances of that video matching the secret facial…
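The article doesn’t go into implementation details, but the two-step flow it describes (a face match followed by a match against the enrolled secret motion) is easy to sketch. Everything below is hypothetical: the embeddings, thresholds, and helper names are illustrative stand-ins, not BYU’s actual system.

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class EnrolledUser:
    face_embedding: list[float]   # face-recognition features captured at enrollment
    motion_template: list[float]  # encoding of the user's secret lip/facial motion

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def verify(user: EnrolledUser, live_face: list[float], live_motion: list[float],
           face_threshold: float = 0.9, motion_threshold: float = 0.8) -> bool:
    # Both factors come from the same live capture, so the user shows
    # the face and makes the motion just once, as Lee describes.
    face_ok = cosine_similarity(user.face_embedding, live_face) >= face_threshold
    motion_ok = cosine_similarity(user.motion_template, live_motion) >= motion_threshold
    return face_ok and motion_ok
```

The appeal of the scheme is visible even in this toy version: a static photo produces no motion signal at all, and a stolen video only passes if it happens to contain the user’s secret motion.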

Source…

New Study Suggests That YouTube’s Recommendation Algorithm Isn’t The Tool Of Radicalization Many People Believe (At Least Not Any More)

It’s become almost “common knowledge” that various social media recommendation engines “lead to radicalization.” Just recently, while giving a talk to telecom execs, I was told point blank that social media was clearly evil and clearly driving people into radicalization because “that’s how you sell more ads,” and that nothing I could say could convince them otherwise. Thankfully, though, there’s a new study that throws some cold water on those claims by showing that YouTube’s algorithm — at least in late 2019 — appears to be doing the opposite.

To the contrary, these data suggest that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels….

Indeed, as you read through the report, it suggests that if YouTube’s algorithm has any bias at all, it’s one towards bland centrism.

The recommendations algorithm advantages several groups to a significant extent. For example, we can see that when one watches a video that belongs to the Partisan Left category, the algorithm will present an estimated 3.4M impressions to the Center/Left MSM category more than it does the other way. On the contrary, we can see that the channels that suffer the most substantial disadvantages are again channels that fall outside mainstream media. Both right-wing and left-wing YouTuber channels are disadvantaged, with White Identitarian and Conspiracy channels being the least advantaged by the algorithm. For viewers of conspiracy channel videos, there are 5.5 million more recommendations to Partisan Right videos than vice versa.

We should also note that right-wing videos are not the only disadvantaged groups. Channels discussing topics such as social justice or socialist views are disadvantaged by the recommendations algorithm as well. The common feature of disadvantaged channels is that their content creators are seldom broadcasting networks or mainstream journals. These channels are independent content creators.
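The pairwise figures in the passage above are net flows: impressions recommended from category A to category B, minus those flowing the other way. A toy illustration of that arithmetic (the gross figures below are invented for the example; only the resulting nets echo the study’s 3.4M and 5.5M numbers):

```python
# Hypothetical impression counts between channel categories; the gross
# numbers are made up, chosen only so the nets match the quoted figures.
impressions = {
    ("Partisan Left", "Center/Left MSM"): 10_000_000,
    ("Center/Left MSM", "Partisan Left"): 6_600_000,
    ("Conspiracy", "Partisan Right"): 9_000_000,
    ("Partisan Right", "Conspiracy"): 3_500_000,
}

def net_flow(src: str, dst: str) -> int:
    # Recommendations shown from src-category videos to dst-category
    # videos, minus the reverse direction. Positive means the algorithm
    # advantages dst at src's expense.
    return impressions.get((src, dst), 0) - impressions.get((dst, src), 0)

print(net_flow("Partisan Left", "Center/Left MSM"))  # 3400000
print(net_flow("Conspiracy", "Partisan Right"))      # 5500000
```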

Basically, YouTube is pushing people towards mainstream media sources. Whether or not you think that’s a good thing is up to you. But at the very least, it doesn’t appear to default to extremism, as many people claim. Of course, that doesn’t mean it works that way for everyone; indeed, some people have criticized this study because it only examines recommendations shown to logged-out users. Nor does it mean it wasn’t like that in the past. This study was done recently, and YouTube is said to have been adjusting its algorithms quite a bit over the past few years in response to some of these criticisms.

However, this actually highlights some key points. Given enough public outcry, the big social media platforms have taken claims of “promoting extremism” seriously and have made efforts to deal with them (though I’ll also make a side prediction that some aggrieved conspiracy theorists will try to use this as evidence of “anti-conservative bias,” despite it showing nothing of the sort). Companies are still figuring much of this out, and insisting, on the strength of a few anecdotes of radicalization, that it must always be so is jumping the gun quite a bit.

In a separate Medium blog post, one of the paper’s authors, Mark Ledwich, notes that the “these algorithms are radicalizing everyone” narrative is also grossly insulting to people’s ability to think for themselves:

Penn State political scientists Joseph Philips and Kevin Munger describe this as the “Zombie Bite” model of YouTube radicalization, which treats users who watch radical content as “infected,” and that this infection spreads. As they see it, the only reason this theory has any weight is that “it implies an obvious policy solution, one which is flattering to the journalists and academics studying the phenomenon.” Rather than look for faults in the algorithm, Philips and Munger propose a “supply and demand” model of YouTube radicalization. If there is a demand for radical right-wing or left-wing content, the demand will be met with supply, regardless of what the algorithm suggests. YouTube, with its low barrier to entry and reliance on video, provides radical political communities with the perfect platform to meet a pre-existing demand.

Writers in old media frequently misrepresent YouTube’s algorithm and fail to acknowledge that recommendations are only one of many factors determining what people watch and how they wrestle with the new information they consume.

Is it true that some people may have had their views changed over time by watching gradually more extreme videos? Sure. But how many people did that actually happen to? We have little evidence that it’s a lot. And now there is some real evidence suggesting that YouTube is less and less likely to push people in that direction, even those who might be susceptible to such a thing in the first place.

For what it’s worth, the authors of the study have also created an interesting site, Recfluence.net, where you can explore the recommendation paths of various types of YouTube videos.


Techdirt.

The NSA Designed The Algorithm That’s At The Core Of Internet Security – Business Insider


The NSA is taking a public shellacking online over its domestic communication monitoring systems, which were recently revealed by leaker Edward Snowden. The Internet furor over the NSA is somewhat ironic given that the NSA’s SHA-1 cryptographic algorithm…
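For context: SHA-1, designed at the NSA and published by NIST in 1995, produces a 160-bit digest and long underpinned TLS certificates and code signing. It ships in Python’s standard library, though practical collisions were demonstrated in 2017 and it is now deprecated for new designs:

```python
import hashlib

# SHA-1 maps arbitrary input to a fixed 160-bit (20-byte) digest.
digest = hashlib.sha1(b"hello, internet").hexdigest()
print(digest)  # 40 hex characters

# Practical collisions were demonstrated in 2017 (the SHAttered attack),
# so SHA-1 is deprecated for signatures and certificates today.
```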

“internet security” – read more

YouTube uses Amazon’s recommendation algorithm (Greg Linden/Geeking with Greg)

Greg Linden / Geeking with Greg:
YouTube uses Amazon’s recommendation algorithm  —  In a paper at the recent RecSys 2010 conference, “The YouTube Video Recommendation System” (ACM), eleven Googlers describe the system behind YouTube’s recommendations and personalization in detail.  —  The most interesting disclosure …
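The core of Amazon-style item-to-item recommendation is covisitation counting: items consumed in the same session are presumed related, with a normalization so that raw popularity doesn’t dominate. A simplified sketch in that spirit (the session data and the exact normalization are illustrative, not the paper’s formulation):

```python
from collections import Counter
from itertools import combinations

# Toy watch sessions; each inner list is one user session.
sessions = [
    ["cat_video", "dog_video", "bird_video"],
    ["cat_video", "dog_video"],
    ["dog_video", "news_clip"],
]

plays = Counter(v for s in sessions for v in s)   # overall popularity
covisits: Counter = Counter()                     # co-occurrence counts
for s in sessions:
    for a, b in combinations(set(s), 2):
        covisits[frozenset((a, b))] += 1

def relatedness(a: str, b: str) -> float:
    # c_ij / (c_i * c_j): dividing by both popularities keeps globally
    # popular videos from looking "related" to everything.
    return covisits[frozenset((a, b))] / (plays[a] * plays[b])

for v in sorted((v for v in plays if v != "cat_video"),
                key=lambda v: relatedness("cat_video", v), reverse=True):
    print(v, round(relatedness("cat_video", v), 3))
```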

Read more