“What do Uber, Volkswagen and Zenefits have in common? They all used hidden code to break the law.” @ossia https://medium.freecodecamp.com/dark-genius-how-programmers-at-uber-volkswagen-and-zenefits-helped-their-employers-break-the-law-b7a7939c6591

The untold stories of women who moved the world forward – The Verge

http://www.theverge.com/2017/3/8/14850102/day-without-a-woman-cars-transportation-racing-gm-mercedes

— excerpt below —

The untold stories of women who moved the world forward


A day of silence to honor pioneers who made noise

Photo: General Motors

Last month I saw Gloria Steinem and Octavia Spencer speak on a panel about the film Hidden Figures at the Makers Conference. An audience member posed the question, “How do we find the other hidden figures in history?”

“We have to be tenacious,” Spencer responded. “If you don’t know the story, how can you seek it out? First we have to ask questions and we have to acknowledge every person on a team. Women couldn’t put their name on reports and men took the credit for all their work, I mean come on.”

 Photo: General Motors

That got me thinking. What have I been missing? I am a woman who writes about transportation, often looking forward, trying to measure disparities that still exist, but not always spending enough time looking back to understand how we arrived here.

In virtually every aspect of industrial innovation, women have played an essential part in forming that history. When women were left out of the decision making, it was never for lack of interest, but rather for lack of opportunity. And when women did do something significant, it often took many years for their contributions to be acknowledged, if at all. When commended, their achievements were heralded as something noteworthy because it wasn’t deemed normal for women to participate in the process of progress.

The rise of the automobile coincided with the rise of the struggle for women’s rights. In 1914, French-born Dorothée Pullinger tried to join the Institution of Automobile Engineers, but was denied entry because she was a woman. She persisted and was finally granted access in 1920, the year American women gained the right to vote. She later oversaw production at the Galloway Motor Car Company in Scotland, and moonlighted as a race car driver as well. In 1921, Bessie Coleman, the first African-American pilot, received her flying license from the Fédération Aéronautique Internationale.

Almost from the beginning, women drivers made their mark on society. Bertha Benz, the wife of Mercedes-Benz founder Karl, took the first cross-country road trip in Germany in 1888, but only recently has been celebrated for her contributions. In the summer of 1909, Alice Ramsey and three other women traveled from New York City to San Francisco in a Maxwell, a journey she wrote about in the 1961 book Veil, Duster and Tire Iron. She became the first woman inducted into the Automotive Hall of Fame in the year 2000, 91 years after the fact. That’s a long time to wait for props.

Here and there women show up in the transportation history books. The most notable was Harriet Tubman, who liberated over 300 people by navigating the Underground Railroad. Martha Coston was issued a patent in 1859 for her telegraphic night signals for maritime use. Mary Walton was issued patents in the 1880s for her work reducing railroad noise pollution, and Olive Dennis contributed to the development of the B&O Railroad as an engineer. Some women are mentioned for their work at burgeoning car companies throughout the 20th century. Automobile Magazine reported that Betty Thatcher Oros worked as a Hudson designer in the 1930s, and Helene Rother became the first female designer at GM in 1943. Audrey Moore Hodges worked at both Studebaker and Tucker in the 1940s. In 1937, Willa Brown became the first African American commercial pilot.

In the grand scheme of things, these women’s contributions are significant, but they are overshadowed by their male colleagues, and by the stereotypes, stigma, and barriers that kept women from going far in big numbers. But what about the others, women who made things, pushed boundaries, and innovated, but who are still unknown? I am certain they existed, but they have sailed under the radar, or, like the women portrayed in Hidden Figures, been carelessly or deliberately left out of the stories. For every Mary Barra and Amelia Earhart, there are many more Jane Does.

Shirley Muldowney. Photo: Getty Images

In the post-war car boom, women became a driving force in the marketplace. Some male executives and marketers, eager to sell cars to women, began to experiment with different ways to appeal to female customers. Outside of automotive enthusiast circles, it’s a little-known fact that GM hired a group of women designers from the Pratt Institute to work in the GM design studios in the late 1950s. The design chief Harley Earl called them the Damsels of Design, a terrible name, but one that shouldn’t take away from the show cars they developed for the — wait for it — Feminine Auto Show held in 1958. It’s been reported that the designers had strict limits on what they could touch in the car interior; the instrument panel was off limits. But despite those restrictions, some of their innovations were pioneering, such as light-up mirrors, glove compartments, and childproof doors.

 Photo: General Motors

Faster, first, speed, aggression: When I think about the metaphors of progress, motorsports is among the more profound. Women’s impact on racing is an extraordinary achievement considering women were banned from racing in various organizations and then were relegated to special races for women only. But some women felt that racing wasn’t a gendered pursuit. I met one of these extraordinary women, Denise McCluggage, a journalist who went head to head with motorsports giants and continued to document the industry well into her 80s. Other standouts include Janet Guthrie, the first woman to compete in the Indy 500, in 1977, and Shirley Muldowney, the first woman to receive a drag racing license and the inspiration for the film Heart Like A Wheel, the L7 song “Shirley,” and the Le Tigre song “Hot Topic.” In the mid-1970s, “Nitro Nellie” Goins broke barriers as an African-American woman drag racer; she was inducted into the East Coast Drag Times Hall of Fame in 2014.

Once these stories of women who worked in all aspects of the field are unearthed, their capacity to inspire is profound. These stories defy what we’ve been taught. At that Makers talk, Gloria Steinem also said, “We still do not know history. It’s still a political history that we are learning.”

For every high profile trailblazer, there’s the behind-the-scenes woman whose story is waiting to be discovered, in the foreground of a photo, in the fine print, or in a tiny smudged corner of the ledger. What I do know is that I’m grateful to all of them, because in some way each made it a little bit easier for me to navigate this strange space in the car industry where women are still widely underrepresented.

 Photo: General Motors

So in honor of all the women whose stories were silenced in transportation, science, arts and culture, and the technology we cover at The Verge, today I strike in solidarity. Finding my way here was an adventure, but it’s nothing compared to what came before me, when women who spoke out had to watch their backs. Tomorrow is a new day in the massive work ahead of looking, listening, and unearthing the truth that will help us remember how important it is to fight for our place in the future.


Jeff Kirschner: This app makes it fun to pick up litter | TED Talk | TED.com

See also: Litterati.org
— video transcript below —

This story starts with these two — my kids. We were hiking in the Oakland woods when my daughter noticed a plastic tub of cat litter in a creek. She looked at me and said, “Daddy? That doesn’t go there.”
0:28

When she said that, it reminded me of summer camp. On the morning of visiting day, right before they’d let our anxious parents come barreling through the gates, our camp director would say, “Quick! Everyone pick up five pieces of litter.” You get a couple hundred kids each picking up five pieces, and pretty soon, you’ve got a much cleaner camp. So I thought, why not apply that crowdsourced cleanup model to the entire planet? And that was the inspiration for Litterati.
0:54

The vision is to create a litter-free world. Let me show you how it started. I took a picture of a cigarette using Instagram. Then I took another photo … and another photo … and another photo. And I noticed two things: one, litter became artistic and approachable; and two, at the end of a few days, I had 50 photos on my phone and I had picked up each piece, and I realized that I was keeping a record of the positive impact I was having on the planet. That’s 50 less things that you might see, or you might step on, or some bird might eat.
1:29

So I started telling people what I was doing, and they started participating. One day, this photo showed up from China. And that’s when I realized that Litterati was more than just pretty pictures; we were becoming a community that was collecting data. Each photo tells a story. It tells us who picked up what, a geotag tells us where and a time stamp tells us when. So I built a Google map, and started plotting the points where pieces were being picked up. And through that process, the community grew and the data grew. My two kids go to school right in that bullseye.
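(As an aside on the data model the talk sketches here, where each photo carries who, what, where, and when, and the points get plotted on a map, a minimal illustrative sketch in Python follows. The record fields and the litter_records list are hypothetical stand-ins, not Litterati's actual schema or API.)

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LitterRecord:
    """One photographed, picked-up piece of litter (hypothetical schema)."""
    user: str               # who picked it up
    item: str               # what it was, e.g. "cigarette"
    lat: float              # geotag: where (latitude)
    lon: float              # geotag: where (longitude)
    picked_up_at: datetime  # time stamp: when

# Made-up example records standing in for community photos.
litter_records = [
    LitterRecord("alex", "cigarette", 37.804, -122.271, datetime(2016, 5, 1, 9, 30)),
    LitterRecord("sam", "hot sauce packet", 37.804, -122.271, datetime(2016, 5, 1, 10, 0)),
    LitterRecord("alex", "plastic bottle", 37.775, -122.419, datetime(2016, 5, 2, 8, 15)),
]

# "Plotting the points" on a map boils down to one (lat, lon) pair per pickup.
points = [(r.lat, r.lon) for r in litter_records]
print(points)
```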
2:16

Litter: it’s blending into the background of our lives. But what if we brought it to the forefront? What if we understood exactly what was on our streets, our sidewalks and our school yards? How might we use that data to make a difference?
2:32

Well, let me show you. The first is with cities. San Francisco wanted to understand what percentage of litter was cigarettes. Why? To create a tax. So they put a couple of people in the streets with pencils and clipboards, who walked around collecting information which led to a 20-cent tax on all cigarette sales. And then they got sued by big tobacco, who claimed that collecting information with pencils and clipboards is neither precise nor provable. The city called me and asked if our technology could help. I’m not sure they realized that our technology was my Instagram account —
3:09

(Laughter)
3:10

But I said, “Yes, we can.”
3:12

(Laughter)
3:13

“And we can tell you if that’s a Parliament or a Pall Mall. Plus, every photograph is geotagged and time-stamped, providing you with proof.” Four days and 5,000 pieces later, our data was used in court to not only defend but double the tax, generating an annual recurring revenue of four million dollars for San Francisco to clean itself up.
3:39

Now, during that process I learned two things: one, Instagram is not the right tool —
3:43

(Laughter)
3:44

so we built an app.
3:46

And two, if you think about it, every city in the world has a unique litter fingerprint, and that fingerprint provides both the source of the problem and the path to the solution. If you could generate a revenue stream just by understanding the percentage of cigarettes, well, what about coffee cups or soda cans or plastic bottles? If you could fingerprint San Francisco, well, how about Oakland or Amsterdam or somewhere much closer to home? And what about brands? How might they use this data to align their environmental and economic interests?
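(The "litter fingerprint" described here is essentially a percentage breakdown of pickups by item type for a given place. A small sketch of that aggregation, reusing the hypothetical litter_records from the sketch above:)

```python
from collections import Counter

def litter_fingerprint(records):
    """Return the percentage of pickups by item type (a place's 'fingerprint')."""
    counts = Counter(r.item for r in records)
    total = sum(counts.values())
    return {item: round(100 * n / total, 1) for item, n in counts.items()}

# With the example data above, each of the three item types accounts for ~33.3%.
print(litter_fingerprint(litter_records))
```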
4:26

There’s a block in downtown Oakland that’s covered in blight. The Litterati community got together and picked up 1,500 pieces. And here’s what we learned: most of that litter came from a very well-known taco brand. Most of that brand’s litter were their own hot sauce packets, and most of those hot sauce packets hadn’t even been opened. The problem and the path to the solution — well, maybe that brand only gives out hot sauce upon request or installs bulk dispensers or comes up with more sustainable packaging. How does a brand take an environmental hazard, turn it into an economic engine and become an industry hero?
5:10

If you really want to create change, there’s no better place to start than with our kids. A group of fifth graders picked up 1,247 pieces of litter just on their school yard. And they learned that the most common type of litter were the plastic straw wrappers from their own cafeteria. So these kids went to their principal and asked, “Why are we still buying straws?” And they stopped. And they learned that individually they could each make a difference, but together they created an impact.
5:40

It doesn’t matter if you’re a student or a scientist, whether you live in Honolulu or Hanoi, this is a community for everyone. It started because of two little kids in the Northern California woods, and today it’s spread across the world. And you know how we’re getting there? One piece at a time.
6:03

Thank you.
6:04

(Applause)

We No Longer Have Three Branches of Government – POLITICO Magazine



We No Longer Have Three Branches of Government

I served in Congress for 16 years and taught civics for 13 more. Our government no longer looks like the one I told my students about—or the one the Constitution describes.

 

February 27, 2017

For more than a dozen years, teaching government classes to graduate students at Harvard and Princeton, I filled my students’ heads with facts that no longer seem to be true. They have become “alternate facts,” or perhaps just outdated ones.

It has been my habit to begin each semester by slowly taking students through the Constitution, each article and section in turn, emphasizing not only each provision but why it was included. Fundamental to the constitutional process, I taught, was the unique delineation of authority and responsibility: the separation of powers that so cleanly distinguished American government from those that had gone before it. There were three branches, independent of each other, with varied duties and roughly equal power. The greater power—over taxing, spending, deciding whether to go to war, confirming members of the president’s Cabinet and justices of the Supreme Court—had been placed in the Congress, I said, because while the Founders had created a republic, they also added a sprinkling of democracy: The people would choose who would do the actual governing. I would underscore this point by noting the provisions that made clear the Framers’ deliberate rejection of a parliamentary system like the ones they had known in Europe, where legislative and executive power were joined. Here, it was to be the people, not the parties, that ruled, I told my students.

I believed it to be true—certainly it was what the Founders intended, and it was pretty close to the reality when I was first elected to Congress 40 years ago. But it’s no longer accurate. Instead of three equal, independent branches, each a check on the others, today’s federal government is, for practical purposes, made up of either two branches or one, depending on how you do the math. The modern presidency has become a giant centrifuge, sucking power from both Congress and the states, making de facto law through regulation and executive order. Yet the growing power of the executive is not merely a case of presidential power lust. For decades, the Supreme Court has consistently held that on most policy questions, foreign as well as domestic, statute trumps fiat (as recently as 2015’s decision in Zivotofsky v. Kerry, the court declared that “the executive is not free from the ordinary controls and checks of Congress merely because foreign affairs are at issue”). But if Congress subordinates its constitutional duties to political concerns, what then?

Presidents have managed to accumulate such a prominent place at the top of what is now increasingly a pyramid rather than a horizontal structure of three connected blocks because for more than a generation, Congress has willingly abandoned both its constitutional responsibilities and its ability to effectively serve as a check on the executive even when it wishes to do so.

***

In the days after Donald Trump’s election, even after the new Congress was sworn in, congressional leaders waited eagerly to receive direction from the incoming president on budgetary, and even legislative, priorities. It is, at this point, a familiar pattern. When Barack Obama was president, Congressman Steve Israel, who had been tasked with overseeing House Democrats’ messaging, noted that Obama was, in fact, “our messenger in chief.” To a considerable extent, Republicans and Democrats in Congress have taken to seeing themselves not as part of a separate and competing branch of government, but as arms of their respective political parties.

Under the speakership of Newt Gingrich, in an attempt to demonstrate its new cost-cutting zeal, the Congress began to unilaterally disarm itself. Staffing (and thus expertise) was reduced. Foreign travel was scaled back, leaving members of Congress dependent on whatever information the executive branch wished to share with them about important international issues, or what they could discern from reading newspapers. That diminished capacity was further decreased in 2011, when Congress stripped itself of the ability to designate specific spending priorities through appropriations earmarks. As a result, members of Congress were not only deprived of an important tool for negotiating with their colleagues, but power over spending decisions—a fundamental congressional responsibility—was ceded to executive branch bureaucrats whose decisions about which government projects would be funded lacked any public transparency.

Congress’ abdication of responsibility predates Gingrich, however. The Constitution clearly provided that the United States would not send its children to fight and die in foreign wars unless the people themselves, through their elected representatives, thought the sacrifice necessary. Although presidents command the military, they do not decide when troops are to be sent into combat, except in the case of invasion or civil insurrection. Reacting to presidential overreach (beginning with Truman taking the nation to war in Korea without first seeking congressional authorization, and continuing through an undeclared war in Vietnam during the Johnson and Nixon years), Congress eventually stepped in, years later, in an attempt to reassert its authority, but clumsily did the reverse: Under the 1973 War Powers Act, hailed by its sponsors as a means to restrain the executive, presidents were given free rein to go to war so long as they notified Congress first. Members of Congress congratulated themselves for providing that they could step in within 60 days to call a halt to a presidentially initiated conflict, although it was not likely that Congress would pull back support with American troops engaged in combat. In that one spectacularly ill-considered action, the Congress stepped back from its single most important obligation: deciding, as the people’s representatives, if and when America would wage war.

Other international issues speak to a similar trend of congressional retreat from constitutional responsibilities. During the just-ended presidential election, both Trump and Senator Bernie Sanders were sharply critical of international trade agreements that they believed disadvantage the United States. Since Congress is the only body authorized to write American law, it had acted over the years to provide safeguards on matters ranging from environmental protections to worker safety. But for decades, it has repeatedly surrendered its power to protect American interests in trade deals, bowing to presidents’ requests to simply accept whatever agreements the executive strikes with other countries (often without any congressional input). As a result, Congress has agreed to take up trade pacts on a “fast track,” denying its members the right to make any changes in the terms of the agreement, even though the Constitution explicitly gives Congress the authority to make the laws that govern international commerce. Congress has stripped itself of the power to insist that international trade be conducted in a way not harmful to American national interests and, specifically, the interests of American workers.

Americans have become accustomed to seeing Congress—especially when it’s controlled by the same party that holds the White House—wait for presidents to submit their proposed federal budgets before beginning serious discussions about spending decisions. But presidents prepare national budget proposals not because they are entitled to tell Congress what to do, but only because Congress has tasked the president with doing so in order to give the legislature a better sense of his thinking—and give members of Congress the chance to gather the information they need from the executive branch to decide how much to spend (and on what) and whether to increase taxes to pay for it. By stripping itself of sufficient resources to compete with the executive, Congress has made itself not the parent of the national budget, but a secondary player often forced by its own inadequacy to tinker at the edges with what presidents demand.

***

Of course, there remains another branch of government that, like the Congress, is theoretically and constitutionally separate and independent. But in reality the separateness is a bit fuzzy. That’s because even in this age of hyperpartisanship, there is at least one important area in which Democrats and Republicans think alike: both parties view the federal courts, and especially the Supreme Court, not as a neutral, Constitution-bound arbiter, but as a de facto branch of the legislature.

Whether it is a president, a presidential candidate, or a member of Congress doing the evaluating, potential jurists are judged not on judicial temperament, quality of reasoning, or other examples of what were once considered “judicial attributes.” Today, the dominant question is how a nominee for the court will rule on controversial political questions. Last year, both Hillary Clinton and Trump, like presidential candidates before them, announced, as a part of their political campaigns, a “litmus test” for potential court nominees.

While many previous Supreme Court nominees were confirmed by the Senate with little or no dissent, Democrats and Republicans in today’s Senate announce their support or opposition at the instant of the nomination’s announcement—often even before a specific nominee is chosen—in anticipation of how the nominee will vote on questions of abortion, immigration, regulations, firearm ownership, and so on. Whether Merrick Garland or Neil Gorsuch, the question is not whether the nominee is qualified to function judicially, but whether he or she is “one of us”—that is, a fellow liberal or conservative. Each Supreme Court nominee is viewed as if he or she were to be a 101st vote in the Senate.

Today’s “separation of powers” is no longer between the three original, constitutionally created, branches of government, but between, on the one hand, a branch consisting of the president, his supporters in Congress and their mutual supporters on the federal bench; and on the other hand, a branch made up of the party in opposition to the president, his opponents in Congress and their co-partisans on the bench.

America’s Founders recognized the truth in Hobbes’ declaration that governments were needed to prevent abuses of the weak by the powerful. But they recognized that government, too, would need to be prevented from committing its own abuses—hence the need for the sometimes frustrating but nonetheless necessary divisions of authority between the state and federal governments and between the branches of the federal government. That is the system described in the Constitution and the system I taught. But it is not the system by which America operates today—a persistent war between competing political clubs.

I taught my students a system of government based on the Constitution. I thought I was teaching about current events. Instead, I now realize, I was teaching ancient history.

Mickey Edwards is a former eight-term member of Congress and chairman of the House Republican Policy Committee. After leaving office in 1993, he taught government for 13 years at Harvard and Princeton, and became a vice president of the Aspen Institute, where he directs a political leadership program.



Don’t Hire Anyone Over 30: Ageism in Silicon Valley

If you work in Silicon Valley, you’ll be unemployed in middle age, predicts Ted Rall. Silicon Valley Ageism: Special Report.

Source: Don’t Hire Anyone Over 30: Ageism in Silicon Valley

 

—————————————

aNewDomain — Most people know that Silicon Valley has a diversity problem. Women and ethnic minorities are underrepresented in Big Tech. Racist and sexist job discrimination is obviously unfair. It also shapes a toxic, insular white male “bro” culture that generates periodic frat-boy eruptions. (See, for example, the recent wine-fueled rant of an Uber executive who mused — to journalists — that he’d like to pay journalists to dig up dirt on journalists who criticize Uber. What could go wrong?)

After years of criticism, tech executives are finally starting to pay attention — and some are promising to recruit more women, blacks and Latinos.

This is progress, but it still leaves Silicon Valley with its biggest dirty secret: rampant, brazen age discrimination.

“Walk into any hot tech company and you’ll find disproportionate representation of young Caucasian and Asian males,” University of Washington computer scientist Ed Lazowska told The San Francisco Chronicle. “All forms of diversity are important, for the same reasons: workforce demand, equality of opportunity and quality of end product.”

Overt bigotry against older workers — we’re talking about anyone over 30 here — has been baked into the Valley’s infantile attitudes since the dot-com crash 14 years ago.

Life may begin at 50 elsewhere, but in the tech biz the only thing certain about middle age is unemployment.

The tone is set by the industry’s top CEOs. “When Mark Zuckerberg was 22, he said five words that might haunt him forever. ‘Younger people are just smarter,’ the Facebook wunderkind told his audience at a Y Combinator event at Stanford University in 2007. If the merits of youth were celebrated in Silicon Valley at the time, they have become even more enshrined since,” Alison Griswold writes in Slate.

It’s illegal, under the federal Age Discrimination in Employment Act of 1967, to pass up a potential employee for hire, or to fail to promote, or to fire a worker, for being too old. But don’t bother telling that to a tech executive. What used to be a meritocracy has become a don’t-hire-anyone-over-30 situation (certainly not over 40) — right under the nose of the tech media.

Which isn’t surprising. The supposed watchdogs of the Fourth Estate are wearing the same blinders as their supposed prey. The staffs of news sites like Valleywag and TechCrunch skew as young as the companies they cover.

A 2013 BuzzFeed piece titled “What It’s Like Being The Oldest BuzzFeed Employee” (subhead: “I am so, so lost, every workday.”) by a 53-year-old BuzzFeed editor “old enough to be the father of nearly every other editorial employee” (average age: late 20s) reads like a repentant landlord-class sandwich-board confession during China’s Cultural Revolution: “These whiz-kids completely baffle me, daily. I am in a constant state of bafflement at BF HQ. In fact, I’ve never been more confused, day-in and day-out, in my life.” It’s the most pathetic attempt at self-deprecation I’ve read since the transcripts of Stalin’s show trials.

A few months later, the dude got fired by his boss (15 years younger): “This is just not working out, your stuff. Let’s just say, it’s ‘creative differences.’”

Big companies are on notice that they’re on the wrong side of employment law. They just don’t care.

Slate reports: “In 2011, Google reached a multimillion-dollar settlement in a … suit with computer scientist Brian Reid, who was fired from the company in 2004 at age 54. Reid claimed that Google employees made derogatory comments about his age, telling him he was ‘obsolete,’ ‘sluggish,’ and an ‘old fuddy-duddy’ whose ideas were ‘too old to matter.’ Other companies — including Apple, Facebook, and Yahoo — have gotten themselves in hot water by posting job listings with ‘new grad‘ in the description. In 2013, Facebook settled a case with California’s Fair Employment and Housing Department over a job listing for an attorney that noted ‘Class of 2007 or 2008 preferred.’”

Because the fines and settlements have been mere slaps on the wrist, the cult of the Youth Bro is still going strong.

To walk the streets of Austin during tech’s biggest annual confab, South by Southwest Interactive, is to experience a society where Boomers and Gen Xers have vanished into a black hole. Photos of those open-space offices favored by start-ups document workplaces where people over 35 are as scarce as women on the streets of Kandahar. From Menlo Park to Palo Alto, token forty-somethings wear the nervous shrew-like expressions of creatures in constant danger of getting eaten — dressed a little too young, heads down, no eye contact, hoping not to be noticed.

“Silicon Valley has become one of the most ageist places in America,” Noam Scheiber reported in a New Republic feature that describes tech workers as young as 26 seeking plastic surgery in order to stave off the early signs of male pattern baldness and minor skin splotches on their faces.

Whatever you do, don’t look your age — unless your age is 22.

Scheiber continues, “Robert Withers, a counselor who helps Silicon Valley workers over 40 with their job searches, told me he recommends that older applicants have a professional snap the photo they post on their LinkedIn page to ensure that it exudes energy and vigor, not fatigue. He also advises them to spend time in the parking lot of a company where they will be interviewing so they can scope out how people dress.”

Paul Graham, the head of the most prominent start-up incubator, told The New York Times that most venture capitalists in the Valley won’t take a pitch from anyone over 32.

In early November, VCs handed over several hundred thousand bucks to a 13-year-old.

Aside from the legal and ethical considerations, does Big Tech’s cult of youth matter? Scheiber says hell yes: “In the one corner of the American economy defined by its relentless optimism, where the spirit of invention and reinvention reigns supreme, we now have a large and growing class of highly trained, objectively talented, surpassingly ambitious workers who are shunted to the margins, doomed to haunt corporate parking lots and medical waiting rooms, for reasons no one can rationally explain. The consequences are downright depressing.”

One result of ageism that jumps to the top of my mind is brain drain. Youthful vigor is vital to success in business. So is seasoned experience. The more closely an organization reflects society at large, the smarter it is.

A female colleague recently called to inform me that she was about to get laid off from her job as an editor and writer for a major tech news site. (She was, of course, the oldest employee at the company.) Naturally caffeinated, addicted to the Internet and pop culture, she’s usually the smartest person in the room. I see lots of tech journalism openings for which she’d be a perfect fit, yet she’s at her wit’s end. “I’m going to jump off a bridge,” she threatened. “What else can I do? I’m 45. No one’s ever going to hire me.” Though I urged her not to take the plunge, I couldn’t argue with her pessimism. Objectively, though, I think the employers who won’t talk to her are idiots. For their own sakes.

Just a month before, I’d met with an executive of a major tech news site who told me I wouldn’t be considered for a position due to my age. “Aside from being stupid,” I replied, “you do know that’s illegal, right?”

“No one enforces it,” he said, shrugging it off. And he’s right. The feds don’t even keep national statistics on hiring by age.

The median American worker is age 42. The median age at Facebook, Google, AOL and Zynga, on the other hand, is 30 or younger. Twitter, which recently got hosed in an age discrimination lawsuit, has a median age of 28.

Big Tech doesn’t want you to know they don’t hire middle-aged Americans. Age data was intentionally omitted from the recent spate of “we can do better” mea culpa reports on company diversity.

It’s easy to suss out why: they prefer to hire cheaper, more disposable, more flexible (willing to work longer hours) younger workers. Apple and Facebook recently made news by offering to freeze their female workers’ eggs so they can delay parenthood in order to devote their 20s and 30s to the company.

The dirty secret is not so secret when you scour online want ads. “Many tech companies post openings exclusively for new or recent college graduates, a pool of candidates that is overwhelmingly in its early twenties,” Verne Kopytoff writes in Fortune.

“It’s nothing short of rampant,” said UC Davis computer science professor Norm Matloff, about age discrimination against older software developers. Adding to the grim irony for Gen Xers: when today’s forty-somethings were entering the workforce, they suffered reverse age discrimination at the hands of the Boomers then in charge.

Once too young to be trusted, now too old to get hired.

Ageist hiring practices are so over-the-top illegal, you have to wonder: Do these jerks have in-house counsel?

Kopytoff: “Apple, Facebook, Yahoo, Dropbox, and video game maker Electronic Arts all recently listed openings with ‘new grad’ in the title. Some companies say that recent college graduates will also be considered and then go on to specify which graduating classes—2011 or 2012, for instance—are acceptable.”

The feds take a dim view of these ads.

“In our view, it’s illegal,” Raymond Peeler, senior attorney advisor at the Equal Employment Opportunity Commission, told Kopytoff. “We think it deters older applicants from applying.” Gee, you think? But the EEOC has yet to smack a tech company with a big fine.

The job market is supposed to eliminate inefficiencies like this, where companies that need experienced reporters fire them while retaining writers who are so wet behind the ears you want to check for moss. But ageism is so ingrained into tech culture that it’s part of the scenery, a cultural signifier like choosing an iPhone over Android. Everyone takes it for granted.

Scheiber describes a file storage company’s annual Hack Week, which might as well be scientifically designed in order to make adults with kids and a mortgage run away screaming: “Dropbox headquarters turns into the world’s best-capitalized rumpus room. Employees ride around on skateboards and scooters, play with Legos at all hours, and generally tool around with whatever happens to interest them, other than work, which they are encouraged to set aside.”

No matter how cool a 55-year-old you are, you’re going to feel left out. Which, one suspects, is the point.

It’s impossible to overstate how ageist many tech outfits are.

Electronic Arts contacted Kopytoff to defend its “new grad” employment ads, only to confirm their bigotry. The company “defended its ads by saying that it hires people of all ages into its new grad program. To prove the point, the company said those accepted into the program range in age from 21 to 35. But the company soon had second thoughts about releasing such information, which shows a total absence of middle-aged hires in the grad program, and asked Fortune to withhold that detail from publication.” (Fortune declined.)

EA’s idea of age diversity is zero workers over 35.

Here is one case where an experienced forty-, or fifty-, or even sixty-something in-house lawyer or publicist might have saved them some embarrassment — and legal exposure.

In the big picture, Silicon Valley is hardly an engine of job growth; it hasn’t added a single net new job since 1998. “Big” companies like Facebook and Twitter only hire a few thousand workers each. Instagram famously had only 13 employees when it was purchased by Facebook. These companies have little interest in contributing to the commonwealth. Nevertheless, ageism in the tiny tech sector has a disproportionately high influence on workplace practices elsewhere. If it is allowed to continue, it will spread to other fields.

It’s hard to see how anything short of a massive class-action lawsuit — one that dings tech giants for billions of dollars — will make Big Tech hire Xers, much less Boomers.

For aNewDomain, I’m Ted Rall.

Awe-inspiring animals: Cat vs. alligator, animals rescuing other species of animals, amazing animal relationships

 

Newfound respect and awe for cats as amazing creatures: Cat vs. (Raccoon, Alligator, Dog, Snake, Possum, etc.)

 

 

Animals rescue animals of a different species.
Cross-species animal friendships.
Cat acts as guard dog to keep alligators away.
Young cheetah mothers a baby monkey.

 

This one has some annoying parts, but just skip forward at that point. There is amazing footage intermixed, through to the end.

 

 

His “Best Friend” is a Bear

Casey Anderson worked for the Montana Department of Fish, Wildlife and Parks as a wildlife rehabilitation technician, and for several privately owned wildlife parks as an animal keeper and trainer. In 2002, he adopted an orphaned grizzly bear cub, Brutus, from an overcrowded wildlife park where the cub was destined to spend his life in captivity or be euthanized. This led to Anderson’s future career as trainer to Brutus and co-owner of Montana Grizzly Encounter, a sanctuary that rehabilitates grizzlies rescued from bad captivity situations and aids in the study of grizzlies.

 

 

Cat Scares Bear Away

The Space Without

http://www.theatlantic.com/sponsored/qualcomm/the-space-without/542/?sr_source=lift_outbrain

 

Using Computer Vision and AI to interpret the world for the visually impaired.

The same technology that brings us facial recognition and self-driving cars.

 

 

—————————————

Visualized by Jeff Nishinaka

Source: The Space Without