When Steve Jobs was running Apple, he was known to call journalists to either pat them on the back for a recent article or, more often than not, explain how they got it wrong. I was on the receiving end of a few of those calls. But nothing shocked me more than something Mr. Jobs said to me in late 2010, after he had finished chewing me out for something I had written about an iPad shortcoming.

“So, your kids must love the iPad?” I asked Mr. Jobs, trying to change the subject. The company’s first tablet was just hitting the shelves.

“They haven’t used it,” he told me. “We limit how much technology our kids use at home.”

I’m sure I responded with a gasp and dumbfounded silence. I had imagined the Jobs household was like a nerd’s paradise: that the walls were giant touch screens, the dining table was made from tiles of iPads, and that iPods were handed out to guests like chocolates on a pillow.

Nope, Mr. Jobs told me, not even close.

Since then, I’ve met a number of technology chief executives and venture capitalists who say similar things: they strictly limit their children’s screen time, often banning all gadgets on school nights and allotting ascetic time limits on weekends.

I was perplexed by this parenting style. After all, most parents seem to take the opposite approach, letting their children bathe in the glow of tablets, smartphones and computers, day and night.

Yet these tech C.E.O.’s seem to know something that the rest of us don’t.

Chris Anderson, the former editor of Wired and now chief executive of 3D Robotics, a drone maker, has instituted time limits and parental controls on every device in his home. “My kids accuse me and my wife of being fascists and overly concerned about tech, and they say that none of their friends have the same rules,” he said of his five children, ages 6 to 17. “That’s because we have seen the dangers of technology firsthand.
I’ve seen it in myself; I don’t want to see that happen to my kids.”

The dangers he is referring to include exposure to harmful content like pornography, bullying from other kids, and, perhaps worst of all, becoming addicted to their devices, just like their parents.

Alex Constantinople, the chief executive of the OutCast Agency, a tech-focused communications and marketing firm, said her youngest son, who is 5, is never allowed to use gadgets during the week, and her older children, 10 to 13, are allowed only 30 minutes a day on school nights.

Evan Williams, a founder of Blogger, Twitter and Medium, and his wife, Sara Williams, said that in lieu of iPads, their two young boys have hundreds of books (yes, physical ones) that they can pick up and read anytime.

So how do tech moms and dads determine the proper boundary for their children? In general, it is set by age.

Children under 10 seem to be most susceptible to becoming addicted, so these parents draw the line at not allowing any gadgets during the week. On weekends, there are limits of 30 minutes to two hours of iPad and smartphone use. And 10- to 14-year-olds are allowed to use computers on school nights, but only for homework.

“We have a strict no-screen-time-during-the-week rule for our kids,” said Lesley Gold, founder and chief executive of the SutherlandGold Group, a tech media relations and analytics company. “But you have to make allowances as they get older and need a computer for school.”
Some parents also forbid teenagers from using social networks, except for services like Snapchat, which deletes messages after they have been sent. This way, they don’t have to worry about saying something online that will haunt them later in life, one executive told me.

Although some non-tech parents I know give smartphones to children as young as 8, many who work in tech wait until their child is 14. While these teenagers can make calls and text, they are not given a data plan until 16. But there is one rule that is universal among the tech parents I polled.

While some tech parents assign limits based on time, others are much stricter about what their children are allowed to do with screens.

Ali Partovi, a founder of iLike and adviser to Facebook, Dropbox and Zappos, said there should be a strong distinction between time spent “consuming,” like watching YouTube or playing video games, and time spent “creating” on screens.

“Just as I wouldn’t dream of limiting how much time a kid can spend with her paintbrushes, or playing her piano, or writing, I think it’s absurd to limit her time spent creating computer art, editing video, or computer programming,” he said.

Others said that outright bans could backfire and create a digital monster.

Dick Costolo, chief executive of Twitter, told me he and his wife approved of unlimited gadget use as long as their two teenage children were in the living room. They believe that too many time limits could have adverse effects on their children.

“When I was at the University of Michigan, there was this guy who lived in the dorm next to me, and he had cases and cases of Coca-Cola and other sodas in his room,” Mr. Costolo said. “I later found out that it was because his parents had never let him have soda when he was growing up. If you don’t let your kids have some exposure to this stuff, what problems does it cause later?”

I never asked Mr.
Jobs what his children did instead of using the gadgets he built, so I reached out to Walter Isaacson, the author of “Steve Jobs,” who spent a lot of time at their home.

“Every evening Steve made a point of having dinner at the big long table in their kitchen, discussing books and history and a variety of things,” he said. “No one ever pulled out an iPad or computer. The kids did not seem addicted at all to devices.”
In fact, hundreds of European scientists working on the project are threatening a boycott because of this direction. In their view, the initial directive was to be more focused on repairing organic injuries and disorders such as Parkinson’s, Alzheimer’s and physical brain damage sustained in accidents. Post-Traumatic Stress Disorder would be one area that might involve the military.
However, there is a disturbing trend developing in law enforcement and medicine to use what has been learned about the human brain in order to adopt pre-crime systems and predictive behavior technology.
But could a brain scan become standard procedure to see which troops might be inclined to commit insider attacks?
Troops overseas have been working alongside Iraqi and Afghan troops for years, but a new interest is being taken in evaluating potential extremists who are infiltrating to kill from within.
The number of these incidents is statistically low, as reported by Defense One, which cites the inside killing of “several troops in recent years.” But a former Army counterintelligence agent sees an opportunity to apply new technology that presumably can screen people for malicious intent. The system is called HandShake:
Here’s how the HandShake system works: A U.S. soldier would take, say, an Iraqi officer and outfit the subject with a special helmet that can both pick up electromagnetic signals (EEG) and perform functional near-infrared imaging (fNIRS), which measures blood-flow changes in the brain. The soldier would put the subject through a battery of tests, including image recognition. Most of the pictures in the tests would be benign, but a few would contain scenes that a potential insider threat would remember, possibly including faces, locations or even bomb parts. The key is to select these images very, very carefully to cut down on the potential false positives. [...]
When you recognize a picture that’s of emotional significance to you, your brain experiences a 200- to 500-microsecond hiccup, during which the electromagnetic activity drops, measurable via EEG. The reaction, referred to as the P300 response, happens too fast for the test subject to control, so the subject can’t game the system.

The company, Veritas, has issued the following video promo for their system:
The fNIR readings back up the EEG numbers. Together, they speak not only to whether or not a subject is a traitor but to how likely an individual is to act on potentially criminal or treasonous impulses. The system then runs all the data through what Veritas calls a Friend or Foe Algorithm. The output: the ability to pinpoint an insider’s threat potential with 80 to 90 percent accuracy, according to the company. (Source) [emphasis added]
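The screening logic in the excerpt — per-probe recognition responses from two sensors fused into a single threat score — can be sketched as a toy calculation. Everything below is invented for illustration: the amplitude-drop score, the sensor weights, and the flagging cutoff are placeholders, since Veritas’s actual “Friend or Foe” math is not public.

```python
# Toy sketch of a two-signal screening score, loosely modeled on the
# HandShake pipeline described above. All scores, weights, and the
# threshold are hypothetical placeholders, not the real algorithm.

def p300_score(baseline_uv: float, probe_uv: float) -> float:
    """Fractional drop in EEG amplitude after a probe image.

    A larger drop stands in for a stronger recognition response
    (the brief "hiccup" described in the quoted passage).
    """
    if baseline_uv <= 0:
        raise ValueError("baseline amplitude must be positive")
    return max(0.0, (baseline_uv - probe_uv) / baseline_uv)

def friend_or_foe(eeg_drops, fnir_activations,
                  w_eeg=0.6, w_fnir=0.4, threshold=0.5):
    """Fuse per-probe EEG drops and fNIR activations into one score.

    Flags the subject if any single probe image produces a combined
    response above the cutoff, on the idea that one carefully chosen
    image (a face, a location, a bomb part) is decisive.
    Returns (worst_score, flagged).
    """
    if len(eeg_drops) != len(fnir_activations):
        raise ValueError("need one fNIR reading per EEG probe")
    per_probe = [w_eeg * d + w_fnir * f
                 for d, f in zip(eeg_drops, fnir_activations)]
    worst = max(per_probe)
    return worst, worst >= threshold

# Three probe images: a strong recognition response on the second one.
drops = [p300_score(50.0, probe) for probe in (48.0, 12.0, 47.0)]
score, flagged = friend_or_foe(drops, [0.1, 0.9, 0.1])
print(round(score, 3), flagged)  # → 0.816 True
```

Note the design choice the excerpt hints at: because a single well-chosen probe image is meant to be decisive, the sketch flags on the worst per-probe score rather than an average, which a handful of benign images would otherwise dilute.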
It’s obviously ironic that this system is intended to be used on people who never should have encountered the U.S. military in the first place, since the U.S. military arrived based on lies. Moreover, those flagged by such a system are clearly open to being tortured under the policies that have been established in the War on Terror world in which we live.
This system comes at an expense in excess of $1 million to deploy and $500,000 per month thereafter, per site, according to the company’s founder. Both the monetary and ethical costs should ensure that this technology never sees the light of day. However, the military-industrial complex has a provable track record of caring very little about either.
Note: The article linked below demonstrates how the biometric identification system in Afghanistan has already trickled down to the streets of America. If brain-scanning technology is successful overseas, it is guaranteed to show up inside the United States. It has already been proposed for air travel and other applications under the FAST system (Future Attribute Screening Technology). Additionally, with the increased war on whistleblowers, this would be a wonderful tool for employers to weed out those whose desire is not to undermine, but simply to expose criminality.