Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Software’ Category

The future was conceived by Philip K. Dick: China’s in-your-face, all-seeing surveillance system

leave a comment »

The video below is a shortened version of the video in this BBC article.

Written by LeisureGuy

17 December 2017 at 8:07 am

Ahead of their time: The history of PLATO

leave a comment »

From the Amazon description:

At a time when Steve Jobs was only a teenager and Mark Zuckerberg wasn’t even born, a group of visionary engineers and designers—some of them only high school students—in the late 1960s and 1970s created a computer system called PLATO, which was light-years ahead in experimenting with how people would learn, engage, communicate, and play through connected computers. Not only did PLATO engineers make significant hardware breakthroughs with plasma displays and touch screens but PLATO programmers also came up with a long list of software innovations: chat rooms, instant messaging, message boards, screen savers, multiplayer games, online newspapers, interactive fiction, and emoticons. Together, the PLATO community pioneered what we now collectively engage in as cyberculture. They were among the first to identify and also realize the potential and scope of the social interconnectivity of computers, well before the creation of the internet. PLATO was the foundational model for every online community that was to follow in its footsteps.

The Friendly Orange Glow is the first history to recount in fascinating detail the remarkable accomplishments and inspiring personal stories of the PLATO community. The addictive nature of PLATO both ruined many a college career and launched pathbreaking multimillion-dollar software products. Its development, impact, and eventual disappearance provides an instructive case study of technological innovation and disruption, project management, and missed opportunities. Above all, The Friendly Orange Glow at last reveals new perspectives on the origins of social computing and our internet-infatuated world.

Written by LeisureGuy

10 December 2017 at 11:09 am

Posted in Books, Software, Technology

Facebook Allowed Political Ads That Were Actually Scams and Malware

leave a comment »

Jennifer Valentino-DeVries, Jeff Larson, and Julia Angwin report in ProPublica:

In September, an ad with the headline, “New Approval Ratings For President Trump Announced And It’s Not Going The Way You Think,” targeted Facebook users in the U.S. who were over 40 and labeled as “very liberal” by the tech company.

“Regardless of what you think of Donald Trump and his policies, it’s fair to say that his appointment as President of the United States is one of the most…,” ran the text. “Learn more.”

At least some people who clicked on this come-on found their computers frozen. Their screens displayed a warning and a computer-generated voice informed them that their machine had been “infected with viruses, spywares and pornwares,” and that their credit card information and other personal data had been stolen — and offered a phone number to call to fix it.

Actually, the freeze was temporary, and restarting the computer would have unlocked it. But worried users who called the number would have been asked to pay to restore their access, according to computer security experts who have tracked the scam for more than a year.

Russian disinformation isn’t the only deceptive political advertising on Facebook. The pitch designed to lure President Donald Trump’s critics is one of more than a dozen politically themed advertisements masking consumer rip-offs that ProPublica has identified since launching an effort in September to monitor paid political messages on the world’s largest social network. As the American public becomes ever more polarized along partisan lines, swindlers who used to capitalize on curiosity about celebrities or sports are now exploiting political passions.

“Those political ads, especially right now if you look at the U.S., they are actually getting more clicks,” said Jérôme Segura, lead malware intelligence analyst at anti-malware company Malwarebytes. “Where there are clicks, there is going to be interest from bad guys.”

The ads, supplied by ProPublica readers through our Political Ad Collector tool, lured Facebook viewers with provocative statements about hot-button figures such as former President Barack Obama, Ivanka Trump, Fox News commentator Sean Hannity and presidential adviser Kellyanne Conway.

Clicking on the headline, “Sponsors Pull out From His Show Over This?” — over a photo of Hannity with MSNBC commentator Rachel Maddow — led to a page styled to look like the Fox News website. It offered a free bottle of Testo-Max HD, which it described as a cure for erectile dysfunction, although it isn’t approved by the FDA. People who sign up for such free nostrums are typically asked to provide credit card information to pay for shipping and are then automatically charged almost $100 a month, according to reviews online.

Although these scams represent only a tiny fraction of the more than 8,000 politically themed advertisements assembled by the Political Ad Collector, they raise doubts about Facebook’s ability to monitor paid political messages. In each case, the ads ran afoul of guidelines Facebook has developed to curb misleading and malicious advertising. Many of the scams had also been flagged by users, fact-checking groups and cybersecurity services — even the Federal Trade Commission — long before they appeared on the social network.

Moreover, most of the sites may have warranted special attention because they had been registered within the 30 days before users sent them to our Political Ad Collector. Paul Vixie, the co-founder of San Mateo, California-based computer security company Farsight Security, said new website domains are more likely to be shady, because fraudsters often shut sites down after days or even minutes and open new ones to stay ahead of authorities looking to catch them.
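
The 30-day cutoff mentioned above is easy to turn into a screening rule. The sketch below is only an illustration of that heuristic, with invented domain records and field names rather than ProPublica’s or Farsight’s actual tooling: it flags any landing-page domain whose WHOIS creation date falls within 30 days of the date the ad was collected.

```python
from datetime import datetime, timedelta

# The 30-day window comes from the article; everything else (record format,
# domain names, dates) is invented for illustration.
NEW_DOMAIN_WINDOW = timedelta(days=30)

def is_recently_registered(created: datetime, seen_on: datetime) -> bool:
    """True if the domain was registered within the 30-day window before it was seen."""
    return timedelta(0) <= (seen_on - created) <= NEW_DOMAIN_WINDOW

# Hypothetical ad landing pages: (domain, WHOIS creation date, date the ad was collected)
ads = [
    ("example-offer.net", datetime(2017, 11, 20), datetime(2017, 12, 1)),
    ("longstanding-news-site.com", datetime(2009, 3, 14), datetime(2017, 12, 1)),
]

for domain, created, collected in ads:
    if is_recently_registered(created, collected):
        age = (collected - created).days
        print(f"{domain}: registered only {age} days before the ad ran -- flag for review")
```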

As the midterm elections heat up, such cons are likely to proliferate, along with more devious forms of information warfare. Facebook Chief Operating Officer Sheryl Sandberg recently said in an interview with Axios that the social network had missed “more subtle” election interference in part because its security team had been focused on “the biggest threats” of malware and phishing — tricking people into revealing their personal information. Based on ProPublica’s findings, it’s unclear if the world’s largest social network can handle either challenge.

Facebook officials told ProPublica that the company is trying to improve its ability to stop harmful advertising, including malware and frauds, but is aware some bad ads get through its defenses. “There is no tolerable amount of malware on the site. The tolerance is zero, but unfortunately that’s not the same as zero occurrence,” said Rob Goldman, Facebook’s vice president of ads. Goldman said of the 14 deceptive ads ProPublica identified, 12 were removed by Facebook before ProPublica contacted the company in November. Facebook took down the other two after ProPublica alerted it to the ads. . .

Continue reading.

Written by LeisureGuy

5 December 2017 at 3:28 pm

Facebook (Still) Letting Housing Advertisers Exclude Users by Race

leave a comment »

Full disclosure: I have a Facebook problem of my own. Someone has been sending messages in my name (as if they came from me) to some whom I’ve friended. I have posted about this problem and have received some replies to the effect, “Aha. Now I understand those strange messages.” The interesting thing I discovered is that there is no way to report this problem to Facebook, and I feel sure that is because Facebook doesn’t care: their total focus is on harvesting data from their users and selling it on.

For example, see “We Can’t Trust Facebook to Regulate Itself,” by Sandy Parakilas, in the NY Times, which begins:

I led Facebook’s efforts to fix privacy problems on its developer platform in advance of its 2012 initial public offering. What I saw from the inside was a company that prioritized data collection from its users over protecting them from abuse. As the world contemplates what to do about Facebook in the wake of its role in Russia’s election meddling, it must consider this history. Lawmakers shouldn’t allow Facebook to regulate itself. Because it won’t.

Facebook knows what you look like, your location, who your friends are, your interests, if you’re in a relationship or not, and what other pages you look at on the web. This data allows advertisers to target the more than one billion Facebook visitors a day. It’s no wonder the company has ballooned in size to a $500 billion behemoth in the five years since its I.P.O.

The more data it has on offer, the more value it creates for advertisers. That means it has no incentive to police the collection or use of that data — except when negative press or regulators are involved. Facebook is free to do almost whatever it wants with your personal information, and has no reason to put safeguards in place.

For a few years, Facebook’s developer platform hosted a thriving ecosystem of popular social games. Remember the age of Farmville and Candy Crush? The premise was simple: Users agreed to give game developers access to their data in exchange for free use of addictive games.

Unfortunately for the users of these games, there were no protections around the data that was passed through Facebook to outside developers. Once data went to the developer of a game, there was not much Facebook could do about misuse except to call the developer in question and threaten to cut off the developer’s access. As the I.P.O. approached, and the media reported on allegations of misuse of data, I, as manager of the team responsible for protecting users on the developer platform from abuse of their data, was given the task of solving the problem.

In one instance, a developer appeared to be using Facebook data to automatically generate profiles of children, without their consent. When I called the company responsible for the app, it claimed that Facebook’s policies on data use were not being violated, but we had no way to confirm whether that was true. Once data passed from the platform to a developer, Facebook had no view of the data or control over it. In other cases, developers asked for permission to get user data that their apps obviously didn’t need — such as a social game asking for all of your photos and messages. People rarely read permissions request forms carefully, so they often authorize access to sensitive information without realizing it.

At a company that was deeply concerned about protecting its users, this situation would have been met with a robust effort to cut off developers who were making questionable use of data. But when I was at Facebook, the typical reaction I recall looked like this: try to put any negative press coverage to bed as quickly as possible, with no sincere efforts to put safeguards in place or to identify and stop abusive developers. When I proposed a deeper audit of developers’ use of Facebook’s data, one executive asked me, “Do you really want to see what you’ll find?”

The message was clear: The company just wanted negative stories to stop. It didn’t really care how the data was used. . .

Continue reading.

Now Julia Angwin, Ariana Tobin and Madeleine Varner report in ProPublica:

In February, Facebook said it would step up enforcement of its prohibition against discrimination in advertising for housing, employment or credit.

But our tests showed a significant lapse in the company’s monitoring of the rental market.

Last week, ProPublica bought dozens of rental housing ads on Facebook, but asked that they not be shown to certain categories of users, such as African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers.

All of these groups are protected under the federal Fair Housing Act, which makes it illegal to publish any advertisement “with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Violators can face tens of thousands of dollars in fines.

Every single ad was approved within minutes.

The only ad that took longer than three minutes to be approved by Facebook sought to exclude potential renters “interested in Islam, Sunni Islam and Shia Islam.” It was approved after 22 minutes.

Under its own policies, Facebook should have flagged these ads, and prevented the posting of some of them. Its failure to do so revives questions about whether the company is in compliance with federal fair housing rules, as well as about its ability and commitment to police discriminatory advertising on the world’s largest social network.

Housing, employment and credit are the three areas in which federal law prohibits discriminatory ads. However, the U.S. Department of Housing and Urban Development — the agency responsible for enforcing fair housing laws — told us that it has closed an inquiry into Facebook’s advertising policies, reducing pressure on the company to address the issue. In a 2015 newspaper column, Ben Carson, now HUD secretary, criticized “government-engineered attempts to legislate racial equality” in housing.

Facebook’s failure to police discriminatory rental ads flies in the face of its promises in February that it would no longer approve ads for housing, employment or credit that targeted racial categories. For advertising aimed at audiences not selected by race, Facebook said it would require housing, employment and credit advertisers to “self-certify” that their ads were compliant with anti-discrimination laws.

Based on Facebook’s announcement, the ads purchased by ProPublica that were aimed at racial categories should have been rejected. The others should have prompted a screen to pop up asking for self-certification. We never encountered a self-certification screen, and none of our ads were rejected by Facebook.

“This was a failure in our enforcement and we’re disappointed that we fell short of our commitments,” Ami Vora, vice president of product management at Facebook, said in an emailed statement. “The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.” . . .

Continue reading.

I simply do not believe Facebook’s protestations for a second. What is needed is independent oversight and regulation with severe (criminal) penalties for violations. Self-policing never works, which is why organizations always propose it.


Written by LeisureGuy

21 November 2017 at 11:39 am

The Serial-Killer Detector

leave a comment »

Alec Wilkinson writes in the New Yorker:

Thomas Hargrove is a homicide archivist. For the past seven years, he has been collecting municipal records of murders, and he now has the largest catalogue of killings in the country—751,785 murders carried out since 1976, which is roughly twenty-seven thousand more than appear in F.B.I. files. States are supposed to report murders to the Department of Justice, but some report inaccurately, or fail to report altogether, and Hargrove has sued some of these states to obtain their records. Using computer code he wrote, he searches his archive for statistical anomalies among the more ordinary murders resulting from lovers’ triangles, gang fights, robberies, or brawls. Each year, about five thousand people kill someone and don’t get caught, and a percentage of these men and women have undoubtedly killed more than once. Hargrove intends to find them with his code, which he sometimes calls a serial-killer detector.

Hargrove created the code, which operates as a simple algorithm, in 2010, when he was a reporter for the now-defunct Scripps Howard news service. The algorithm forms the basis of the Murder Accountability Project (MAP), a nonprofit that consists of Hargrove—who is retired—a database, a Web site, and a board of nine members, who include former detectives, homicide scholars, and a forensic psychiatrist. By a process of data aggregation, the algorithm gathers killings that are related by method, place, and time, and by the victim’s sex. It also considers whether the rate of unsolved murders in a city is notable, since an uncaught serial killer upends a police department’s percentages. Statistically, a town with a serial killer in its midst looks lawless.
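
That description reduces to a simple grouping computation. The sketch below is an illustration of the idea, not MAP’s actual code: it uses made-up records and thresholds, groups cases by place, victim sex, and method (leaving out the time dimension for brevity), and flags clusters that are both sizable and mostly unsolved.

```python
from collections import defaultdict

# Toy records in the spirit of the FBI's Supplementary Homicide Report.
# Field names, cases, and thresholds are illustrative assumptions, not MAP's code.
murders = [
    {"county": "Lake County, IN", "victim_sex": "F", "method": "strangulation", "solved": False},
    {"county": "Lake County, IN", "victim_sex": "F", "method": "strangulation", "solved": False},
    {"county": "Lake County, IN", "victim_sex": "F", "method": "strangulation", "solved": True},
    {"county": "Cook County, IL", "victim_sex": "M", "method": "firearm", "solved": True},
]

clusters = defaultdict(list)
for case in murders:
    key = (case["county"], case["victim_sex"], case["method"])
    clusters[key].append(case)

MIN_VICTIMS = 3       # a cluster has to be large enough to stand out
MAX_CLEARANCE = 0.5   # and mostly unsolved, since an uncaught serial killer drags clearance down

for (county, sex, method), cases in clusters.items():
    clearance = sum(c["solved"] for c in cases) / len(cases)
    if len(cases) >= MIN_VICTIMS and clearance <= MAX_CLEARANCE:
        print(f"Possible linked series: {len(cases)} {method} victims ({sex}) "
              f"in {county}, clearance rate {clearance:.0%}")
```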

In August of 2010, Hargrove noticed a pattern of murders in Lake County, Indiana, which includes the city of Gary. Between 1980 and 2008, fifteen women had been strangled. Many of the bodies had been found in vacant houses. Hargrove wrote to the Gary police, describing the murders and including a spreadsheet of their circumstances. “Could these cases reflect the activity of one or more serial killers in your area?” he asked.

The police department rebuffed him; a lieutenant replied that there were no unsolved serial killings in Gary. (The Department of Justice advises police departments to tell citizens when a serial killer is at large, but some places keep the information secret.) Hargrove was indignant. “I left messages for months,” he said. “I sent registered letters to the chief of police and the mayor.” Eventually, he heard from a deputy coroner, who had also started to suspect that there was a serial killer in Gary. She had tried to speak with the police, but they had refused her. After reviewing Hargrove’s cases, she added three more victims to his list.

Four years later, the police in Hammond, a town next to Gary, got a call about a disturbance at a Motel 6, where they found a dead woman in a bathtub. Her name was Afrikka Hardy, and she was nineteen years old. “They make an arrest of a guy named Darren Vann, and, as so often happens in these cases, he says, ‘You got me,’ ” Hargrove said. “Over several days, he takes police to abandoned buildings where they recover the bodies of six women, all of them strangled, just like the pattern we were seeing in the algorithm.” Vann had killed his first woman in the early nineties. In 2009, he went to jail for rape, and the killings stopped. When he got out, in 2013, Hargrove said, “he picked up where he’d left off.”

Researchers study serial killers as if they were specimens of natural history. One of the most comprehensive catalogues is the Radford Serial Killer Data Base, which has nearly five thousand entries from around the world—the bulk of them from the United States—and was started twenty-five years ago by Michael Aamodt, a professor emeritus at Radford University, in Virginia. According to the database, American serial killers are ten times more likely to be male than female. Ray Copeland, who was seventy-five when he was arrested, killed at least five drifters on his farm in Missouri late in the last century, and is the oldest serial killer in the database. The youngest is Robert Dale Segee, who grew up in Portland, Maine, and, in 1938, at the age of eight, is thought to have killed a girl with a rock. Segee’s father often punished him by holding his fingers over a candle flame, and Segee became an arsonist. After starting a fire, he sometimes saw visions of a crimson man with fangs and claws, and flames coming out of his head. In June of 1944, when Segee was fourteen, he got a job with the Ringling Brothers circus. The next month, the circus tent caught fire, and a hundred and sixty-eight people were killed. In 1950, after being arrested for a different fire, Segee confessed to setting the tent ablaze, but years later he withdrew his confession, saying that he had been mad when he made it.

Serial killers are not usually particularly bright, having an average I.Q. of 94.5, according to the database. They divide into types. Those who feel bound to rid the world of people they regard as immoral or undesirable—such as drug addicts, immigrants, or promiscuous women—are called missionaries. Black widows kill men, usually to inherit money or to claim insurance; bluebeards kill women, either for money or as an assertion of power. A nurse who kills patients is called an angel of death. A troller meets a victim by chance, and a trapper either observes his victims or works at a place, such as a hospital, where his victims come to him.

The F.B.I. believes that less than one per cent of the killings each year are carried out by serial killers, but Hargrove thinks that the percentage is higher, and that there are probably around two thousand serial killers at large in the U.S. “How do I know?” he said. “A few years ago, I got some people at the F.B.I. to run the question of how many murders in their records are unsolved but have been linked through DNA.” The answer was about fourteen hundred, slightly more than two per cent of the murders in the files they consulted. “Those are just the cases they were able to lock down with DNA,” Hargrove said. “And killers don’t always leave DNA—it’s a gift when you get it. So two per cent is a floor, not a ceiling.”

Hargrove is sixty-one. He is tall and slender, with a white beard and a skeptical regard. He lives with his wife and son in Alexandria, Virginia, and walks eight miles a day, to Mount Vernon or along the Potomac, while listening to recordings of books—usually mystery novels. He was born in Manhattan, but his parents moved to Yorktown, in Westchester County, when he was a boy. “I lived near Riverside Drive until I was four,” he said. “Then one day I showed my mom what I learned on the playground, which is that you can make a switchblade out of Popsicle sticks, and next thing I knew I was living in Yorktown.”

Hargrove’s father wrote technical manuals on how to use mechanical calculators, and when Hargrove went to college, at the University of Missouri, he studied computational journalism and public opinion. He learned practices such as random-digit-dialling theory, which is used to conduct polls, and he was influenced by “Precision Journalism,” a book by Philip Meyer that encourages journalists to learn survey methods from social science. After graduating, in 1977, he was hired by the Birmingham Post-Herald, in Alabama, with the understanding that he would conduct polls and do whatever else the paper needed. As it turned out, the paper needed a crime reporter. In 1978, Hargrove saw his first man die, the owner of a convenience store who had been shot during a robbery. He reported on a riot that began after police officers shot a sixteen-year-old African-American girl. Once, arriving at a standoff, he was shot at with a rifle by a drunk on a water tower. The bullet hit the gravel near his feet and made a sound that “was not quite a plink.” He also covered the execution of a man named John Lewis Evans, the first inmate put to death in Alabama after a Supreme Court abrogation of capital punishment in the nineteen-sixties and seventies. “They electrocuted people in Alabama in an electric chair called the Yellow Mama, because it was painted bright yellow,” Hargrove said. “Enough time had passed since the last execution that no one remembered how to do it. The first time, too much current went through too small a conduit, so everything caught fire. Everyone was crying, and I had trouble sleeping for days after.”

In 1990, Hargrove moved to Washington, D.C., to work for Scripps Howard, where, he said, “my primary purpose was to use numbers to shock people.” Studying the Social Security Administration’s Death Master File—“where we will all end up one day,” Hargrove said—he noticed that some people were included for a given year and dropped a few years later: people who had mistakenly been declared dead. From interviews, he learned that these people often have their bank accounts suddenly frozen, can’t get credit cards or mortgages, and are refused jobs because they fail background checks. Comparing a list of federal grants for at-risk kids in inner-city schools against Census Bureau Zip Codes, he found that two-thirds of the grants were actually going to schools in the suburbs. “He did all this through really clever logic and programming,” Isaac Wolf, a former journalist who had a desk near Hargrove’s, told me. “A combination of resourceful thinking and an innovative approach to collecting and analyzing data through shoe-leather work.”
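
The grants-versus-Zip-Codes comparison is, at bottom, a join-and-aggregate job. Here is a toy version with invented data, just to show the shape of the analysis; the real effort, as Wolf suggests, was in collecting and cleaning the underlying records.

```python
# A toy version of the grants-versus-Zip-Codes comparison described above.
# Both the ZIP classifications and the grant records are invented.
zip_classification = {
    "10027": "inner city",
    "10583": "suburb",
    "60649": "inner city",
    "60521": "suburb",
}

grants = [
    {"school": "School A", "zip": "10583", "amount": 250_000},
    {"school": "School B", "zip": "10027", "amount": 100_000},
    {"school": "School C", "zip": "60521", "amount": 300_000},
]

total = sum(g["amount"] for g in grants)
suburban = sum(g["amount"] for g in grants
               if zip_classification.get(g["zip"]) == "suburb")
print(f"Share of grant dollars going to suburban ZIP codes: {suburban / total:.0%}")
```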

In 2004, Hargrove was assigned a story about prostitution. To learn which cities enforced laws against the practice and which didn’t, he requested a copy of the Uniform Crime Report, an annual compilation published by the F.B.I., and received a CD containing the most recent report, from 2002. “Along with it, at no extra cost, was something that said ‘S.H.R. 2002,’ ” he said. It was the F.B.I.’s Supplementary Homicide Report, which includes all the murders reported to the Bureau, listing the age, race, sex, and ethnicity of the victim, along with the method and circumstances of the killing. As Hargrove looked through it, “the first thing I thought was, I wonder if it’s possible to teach a computer to spot serial victims.” Hargrove said that for six years he told each of his editors at Scripps Howard that he wanted to find serial killers using a computer, and the response was always, “You’re kidding, right?”

In 2007, Hargrove did an investigation into SIDS, Sudden Infant Death Syndrome, after wondering why, according to the Centers for Disease Control’s infant-mortality records, so many more babies in Florida died from accidental suffocation than did babies in California, even though California had many more babies. . .

Continue reading.

Written by LeisureGuy

20 November 2017 at 2:20 pm

A completely rebuilt Firefox browser: Firefox Quantum

leave a comment »

I use Firefox to write my blog posts, and it just updated itself to the newest version, Firefox Quantum. This version looks damn good. You can read about it (and download it) on this page.

Written by LeisureGuy

14 November 2017 at 1:21 pm

Posted in Software, Technology

Easy video intro to neural networks

leave a comment »

Part 1:

Part 2:

And also:
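
The embedded videos don’t survive in this text version. As a rough stand-alone companion (not code taken from the videos), here is a tiny two-layer network trained on XOR in plain NumPy, showing the forward pass and the backpropagation step that such introductions typically walk through.

```python
import numpy as np

# A tiny two-layer network trained on XOR -- a stand-alone companion sketch,
# not code taken from the videos above.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> 4 hidden units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> 1 output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    # Backward pass: squared-error loss, chain rule applied layer by layer
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0)

print(np.round(out, 2))  # should end up close to [[0], [1], [1], [0]]
```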

Written by LeisureGuy

31 October 2017 at 9:30 am
