These eight short story collections would make excellent sci-fi anthology shows

http://bit.ly/2VMzmeU

Since the beginning of the modern science fiction genre, authors have built careers on writing short stories for magazines and anthologies, and more recently for websites. While those works don’t quite get the same attention as a novel, collections of an author’s short fiction have long been a good way to catch up on their published repertoire. Recently, there’s been more attention on shorter fiction thanks to projects such as Netflix’s Love, Death + Robots, and a new anthology series based on horror author Nathan Ballingrud’s fantastic collection, North American Lake Monsters.

What’s more, a number of anthology shows have popped up over the years on a variety of streaming services. Netflix and Channel 4 produced Black Mirror; CBS recently brought back The Twilight Zone; HBO is running Room 104; Amazon adapted a variety of stories from Philip K. Dick for Electric Dreams; and Hulu has its horror-themed Dimension 404. There are other projects on the horizon as well: AMC began developing a series based on Ted Chiang’s story “Liking What You See: A Documentary,” which was featured in his 2002 collection, Stories of Your Life and Others, and set up a writers room for a show based on the short stories of Ken Liu.

It’s easy to see why anthology shows based on short stories are appealing: they don’t demand a whole lot of commitment from viewers, and they provide plenty of variety. A science fiction writer’s collection of short stories offers both: self-contained, bite-sized narratives that can play out in 20 to 40 minutes. Don’t like one? Skip to the next. With word that an adaptation of Ballingrud’s debut collection is in the works, we had some ideas for other single-author collections that might make for good anthology series in their own right.

Image: Tor.com

Six Months, Three Days, Five Others by Charlie Jane Anders

io9 co-founder Charlie Jane Anders has forged a notable career in recent years with a number of fantastic short stories and two excellent novels (disclaimer: I used to work for her at io9), and she released a short collection called Six Months, Three Days, Five Others through Tor.com.

It’s a small collection, but each of the stories packs a punch. “The Fermi Paradox Is Our Business Model” is about an alien civilization that seeds the galaxy with life, then waits for each world to burn itself out so that it can cheaply extract the resources left behind. The title story, “Six Months, Three Days,” earned Anders a Hugo Award in 2012, and is an emotional story about a woman who can see all possible futures and a man who can see the one true future. At one point, it was in the works for a TV adaptation as well.

This short collection would make for a great series of emotional and thought-provoking episodes.


Image: Folio Society / Alexander Wells

I, Robot by Isaac Asimov

Forget the 2004 “adaptation” of Isaac Asimov’s collection of robot stories. That film was a thriller that used a bunch of the bigger ideas the author came up with over the years, but it didn’t really adapt any of the stories.

The original short story collection contains 10 of Asimov’s classic robot stories, each of which revolves around a central premise: the Three Laws of Robotics, which govern the behavior of his robots. Each story deals with a loophole in that programming: “Runaround” is about a mining robot on Mercury that gets stuck in a loop; “Liar!” is about a robot that causes problems when it doesn’t want to hurt a couple of humans’ feelings; and “Evidence” follows a politician who is accused of secretly being a robot.

The entire collection would make for a fantastic anthology series, one that deals with the ramifications of technology and how it can break.


Image: Night Shade Books

Pump Six and Other Stories by Paolo Bacigalupi

If Black Mirror is anything to go by, audiences will tune in for extremely bleak science fiction. One good example of this comes in the form of Paolo Bacigalupi’s collection, Pump Six and Other Stories.

Bacigalupi is best known for books like The Windup Girl and The Water Knife, which paint some pretty bleak portraits of the future of our planet. That tendency carries over in this book: his story “The People of Sand and Slag” is about a trio of genetically modified humans guarding a mining corporation in the distant future. When they discover that an “intruder” is really a dog, they try to keep it alive. It doesn’t go well. Another, “The Tamarisk Hunter,” is about a bounty hunter named Lolo who’s tasked with finding and killing water-thirsty tamarisk trees in a California gripped by drought.

This wouldn’t be a happy series, but it would make for a great, pointed show about the dangers of climate change.


Image: Orbit Books

How Long ‘Til Black Future Month? by N.K. Jemisin

This was one of our favorite books of last year: N.K. Jemisin’s collection of short stories, which span the breadth of cyberpunk, epic fantasy, and hard science fiction, all of which provide some pointed commentary on the inequality present throughout the world.

This particular book would make for a great series, with stories like “The City Born Great,” which follows the personification of New York City, and “The Ones Who Stay and Fight,” about a utopian society where knowledge of inequality is forbidden.

The collection is a timely and relevant body of work, and a series based on this book would sit nicely alongside something like Hulu’s The Handmaid’s Tale.


Image: Talos Press

Tomorrow Factory by Rich Larson

Rich Larson has become one of my favorite short story authors working right now (disclaimer: he provided a story for an anthology I edited, War Stories: New Military Science Fiction), and last year, he released a collection called Tomorrow Factory, which pulls together 23 of his recent short stories.

Larson’s stories are quite a bit of fun to read, and they cover a lot of territory, from cyberpunk (“Ghost Girl,” about an orphaned albino girl who discovers a mech in the midst of a garbage dump, and “Meshed,” about a basketball scout who discovers that a prospect, Oxford Diallo, isn’t quite what he appears) to space opera (“The Ghost Ship Anastasia,” about a starship repair crew that runs into all sorts of problems on one difficult mission).

These stories would make for a really fun, dynamic series about how we use technology.


Image: Saga Press

The Unreal and the Real by Ursula K. Le Guin

If there’s one classic author whose work would make for a fantastic anthology series, it’s Ursula K. Le Guin. She hasn’t had a great experience with adaptations — the less said about the SCI FI Channel’s adaptation of Earthsea, the better. But her stories are really fantastic, and another attempt would probably go over a lot better now.

She’s released a number of collections over the years, and one recent release is a reissue of The Unreal and the Real, which contains nearly 40 stories, broken into two groups: stories set in a realistic world, and stories set in more fantastic locations, like her world of Earthsea or her larger Hainish space opera universe.

This collection — or others that she’s published — would provide a solid basis for a brilliant anthology series that reflects on the morality of societies and cultures, here on Earth or on distant worlds.


Image: Head of Zeus

The Wandering Earth by Cixin Liu

Cixin Liu might be most famous for his novel The Three-Body Problem and its sequels, but he’s also released a number of short stories over the years, which have been collected into a book, The Wandering Earth.

If that title sounds familiar, it’s because the title story was recently turned into China’s first big science fiction film, which you can now watch on Netflix. It’s a big, epic space disaster story, and the book holds other big stories as well: “Devourer,” about an alien ship that floats through space consuming planets; “Mountain,” about a group of aliens trapped in a bubble of rock who try to discover what lies beyond their world; and “Sun of China,” about a boy from a rural town who grows up to become an astronaut on a solar installation in orbit.

Liu’s stories are often described as a throwback to the genre’s classic age, and this book (minus “The Wandering Earth”) could make for a fantastic series of epic adventures in outer space, provided you had the right budget.


Image: Karen Traviss

View of a Remote Country by Karen Traviss

I first came across Karen Traviss through her Wess’Har War and Star Wars novels, but for several years, she published a number of fantastic short stories in a variety of publications, which she later gathered into a self-published collection, View of a Remote Country.

There are some really fascinating stories in this book: “Suitable for the Orient” follows a doctor who’s stationed on a distant planet amidst a conflict between the native lifeforms and the human colonists, while “An Open Prison” depicts a future where convicts are locked into mechanical suits and forced to serve the public and the people they’ve wronged.

I’ve often found Traviss’s stories to be interesting meditations on people and technology, and the pitfalls between them.




A Pastor’s Case for the Morality of Abortion

http://bit.ly/2MasBEl

In America, the debate about abortion is often reduced to binary categories. Religious versus secular. Misogynists versus murderers. Even “Christian theocracy” versus, presumably, everyone else.

With abortion once again in the headlines this month, after Alabama and several other states passed near-total bans on the procedure, Jes Kast, a pastor in the United Church of Christ, spoke up as someone who does not fit those categories. She supports abortion rights, and is representative of her denomination on this issue: According to the Pew Research Center, 72 percent of people in the UCC, a small, progressive denomination with a little less than 1 million members, think abortion should be legal in all or most cases. Kast also serves on the clergy-advocacy board of Planned Parenthood, which works to “increase public awareness of the theological and moral basis for advocating reproductive health,” according to its website.

Kast has not always supported abortion, however—far from it. She grew up in a conservative-Christian community in West Michigan, attended an evangelical church as a teenager, and participated in anti-abortion protests. Her process of coming to support abortion rights has been long, and definitive: Kast no longer believes there are any circumstances under which it is immoral to get an abortion. She has been open about her views with members of her new church in State College, Pennsylvania, and told me she would feel comfortable preaching about abortion from the pulpit.

Kast’s experience shows how widely people’s moral perspectives on abortion can vary, including among clergy. Although she has clear views on abortion, she lives in community with people who see the issue very differently. Part of her job, and her life, is to navigate those differences with care, which can sometimes be complicated. I asked Kast about how her views have changed, what it’s like disagreeing with her conservative Christian family, and why she believes scripture justifies abortion. Our conversation has been edited for length and clarity.


Emma Green: When you were growing up, what did you think about the morality of abortion?

Jes Kast: My family has deep roots in the pro-life movement. When I was a child, before I even knew this language of pro-life and pro-choice, my family would talk with vigor about protecting the unborn. I heard that at church. I heard that at the dinner table. One of my family members had a sweater that said, “Endangered Species,” with all of the different animals. One of the pictures was a fetus inside of a womb.

That’s what it meant to be Christian: to protect the unborn.

Green: Did you engage in any activism around this issue?

Kast: The first protest I ever went to was when I was 12. It was an anti-abortion protest. We lined the streets in my small Michigan town, with signs—something along the lines of save the unborn babies. It was a silent protest.

Green: When would you say you first started questioning the values you had been taught around abortion?

Kast: It began with other issues, which led to abortion. I was in college, a private Christian school in Michigan. At that time, President [George W.] Bush was talking about [weapons of mass destruction].

I remember sitting with my mom and my dad at Chili’s. And I said, “I don’t believe that there are WMDs, and I’m not sure I trust President Bush.” In that mindset, to be Republican is to be Christian. That all went together. And I began questioning it.

Green: How did that connect with the question of abortion?

Kast: I began to understand myself as a woman in ministry. I began to see myself as this Christian feminist. I began to own my voice differently, and to question the rules of engagement of Christianity that I was raised with.

I began saying things like, “Why is it that abortion is the only issue that my parents and family really care about?” I have a very good relationship with my family. I’m not trying to paint them in a negative light. But why? Why is this the only issue?

Like many Millennials coming out of evangelicalism, I began to care about different justice issues. I began to care about the Earth, and racial justice, and interfaith justice. And one of the topics that arose for me was abortion.

I began questioning: What about bodily autonomy? Isn’t that justice? How would God ever infringe upon that?

And this was a big one for me: Why is it that when it comes to this topic, it’s almost always white, straight, Christian men who are the loudest?

[Read: Rachel Held Evans, hero to Christian misfits]

Green: How would you describe your views on abortion today?

Kast: When I was serving on the Upper West Side, my church and different synagogues in the area got together for the One Voice to Save Choice event.

I remember approaching my board, saying, “Is this who we are? Can I go, as a pastor?”

And it was unequivocal: Yes.

Cecile Richards, the previous president of Planned Parenthood, was there. It was a formative moment for me. Here we are, different faithful people of different creeds coming together to say bodily autonomy and reproductive rights are justice issues. That was a tipping point for me.

I believe reproductive rights and bodily autonomy are deeply important. I believe that is faithfulness to Christianity. I believe in access to safe and legal abortions. I believe that the person who can best make these decisions is the person who's considering these decisions.

I meet one-on-one with people in my congregation. Although I am ordained, and I carry a certain authority with me, my job is to walk with people through those decisions. I have known people who have accessed abortion and reproductive care. Some haven’t had any emotional turmoil over it; it has been more like a celebration for them. And I know people who saw it as a hard decision.

I believe every person I encounter, including myself, has the right to their body. When that bodily autonomy is taken away, to me, that is against Christian scripture, and is against the Gospel I believe in.

Green: So, just to be clear, what do you think is the Christian theological argument for abortion?

Kast: When people talk about, “Our body is a temple of God, and holy,” I see that as, I have the right to choices over my body, and the freedom to make the decisions that are right for me.

In Genesis, it says that God breathed God’s spirit into our lives—Christians would say “the Holy Spirit.” Because of that, we’re not puppets controlled by God. Because of the image of God in us, we have freedom. That’s what’s really clear to me, is freedom.

There’s this little passage in the Gospel of John that continues to stay with me. Jesus says, “I have come that they might have life and have it abundantly.” The Greek word that’s used there for “life abundance” is this word zoe, which means not just that you’re living and breathing, but that God’s plan for our lives is to actually have a meaningful life with loving contentment and satisfaction.

Because of that—because I value life, and I believe Jesus values life—I value the choices that give us the type of life that we need.

Green: I often speak to people in what you might call a gray space on abortion. They might say something like, “I believe in a legal right to a safe and accessible abortion. But on a personal, one-on-one level, I believe in encouraging people to choose to carry pregnancies to term.”

Would you say that perspective resonates with you, especially in those pastoral-counseling contexts?

Kast: No. I still think encouraging someone to carry a fetus and give birth to a baby might not be the most life-giving decision. For instance, let’s take a more extreme case: a 12-year-old who gets raped. I think it’s evil to ask that 12-year-old to carry that baby to term. I don’t think that’s life. I don’t think that’s valuing a 12-year-old’s life.

Green: Do you think there’s any context in which it’s immoral to have an abortion?

Kast: That’s a really great question. Let me think if I do think that or not. Let me just be really thoughtful about that.

Green: Okay.

Kast: I don’t. I really don’t. I don’t think I do. For me, it’s a health-care issue. The best person to make that decision is the person who has to decide that. And if that person believes it’s immoral for them, then I would have to honor the conscience of that person and walk with them through what they would choose.

Green: You talked earlier about this view that was imparted to you in your childhood—that to be Christian is to be opposed to abortion. Do you believe that Christians have to be opposed to abortion?

Kast: No. No. Like, not at all.

In some ways I feel I have repented from a view of Christianity that I don’t believe is true anymore. I believe I am walking in faithfulness.

I think there’s this view that progressive liberal Christians don’t take scripture or theology seriously. That couldn’t be farther from the truth. I take scripture and theology, I believe, more seriously now.

[Read: The progressive roots of the pro-life movement]

Green: Would you say there is space in your church for a vocally pro-life person?

Kast: I actually thrive in places where not everyone has the same opinions as I do. When I look out into the congregation, I don’t expect everybody to agree with me. I am their pastor. Whomever they voted for, whatever their values are around abortion or whatever issue, I will be their pastor and love them. So yes, there’s room.

I’ve also been very transparent about who I am. I try to be honest about that. I’m playing with this phrase—conviction without certainty. I am Christian, and I follow this guy named Jesus who said, above all, love your God with your whole heart and soul and mind, and love your neighbor as yourself. And for me, that includes the people who didn’t vote like me, who hold different opinions than me. That is important to me.

Green: What would your parents think of your views on abortion today?

Kast: My parents know. They don’t understand. They ask questions. They love their daughter, and they’re very good at loving their daughter, and they would disagree with me—I mean, probably strongly. I think anti-abortion conversations are still probably one of the No. 1 things my parents value in their understanding of Christianity. And they couldn’t be prouder of their daughter who is a minister.

Green: How do you wish abortion was talked about in Christian circles in the United States?

Kast: I wish there was more clarity of conviction with compassion. I wish one section of Christianity didn’t demonize another section of Christianity, because there are Christians like myself, and like my denomination, who see safe and legal abortion access as part of what it means to do justice. We are deeply faithful Christian people. I would love that respect from my more conservative siblings in faith.

I value a more nuanced conversation. I value thoughtfulness a lot. And I wish those who are considering the choices in front of them were honored and respected, and that government and institutions and even God doesn’t have the final say over how we make the choices that are best for us.




Dungeons and Back Alleys: The Fate of the Mentally Ill in America

http://bit.ly/2K1Fb5X


Note to readers: This is a companion piece to the first installment of Awais Aftab, MD's interview series, Conversations in Critical Psychiatry: Allen Frances, MD. We thank both Dr Aftab and Dr Frances for their contributions.

 

COMMENTARY

Dr Frances is Professor Emeritus and former Chair, Department of Psychiatry, Duke University; Chair, DSM-IV Task Force; and author of Saving Normal and Essentials of Psychiatric Diagnosis.

My career is ending on a sour note. It is hard to be complacent when 600,000 people who should be our patients are instead languishing as prisoners or sleeping on the streets. County jails are now the biggest providers of psychiatric care for people suffering from severe mental illness. And our patients do particularly poorly in jail—enduring long stays, frequently in crazy-making solitary confinement, and often targeted for physical and sexual abuse. I have seen long rows of cells, in each of which is a desperate mentally ill occupant who has smeared excrement all over the walls and windows.

We have no excuse for collectively failing the patients who need us most. It is easy enough for each of us to blame the system—the government neglect, professional association passivity, and advocacy groups’ loss of mission—but I also blame myself for having done far too little, far too late. We are all part of the system and must take personal responsibility for its miserable performance.

We won’t be able to correct this horrible mess without understanding its history. The mental hospitals established by the states in the 19th century had the best of intentions and the worst of consequences. Their goal was the humane treatment of the mentally ill, following the principles promulgated by the father of modern psychiatry, Philippe Pinel. These asylums were meant to be places of peaceful rural retreat, providing safety and a meaningful life for psychiatric patients who had no place in the rapidly growing cities. The hospitals were self-contained communities, much more civilized and welcoming than the chaotic urban environments. There were workshops and surrounding farms that allowed patients to learn skills and feel productive. Hospital directors, staff, and families lived on the grounds and broke bread with the patients. The architecture of the buildings was usually strikingly beautiful, and they were surrounded by lovely bucolic settings.

The 20th century witnessed a rapid and thorough degradation of the system, with cattle-car overcrowding and system-wide patient neglect. Professionalizing the staff depersonalized the care. Growing cities surrounded and swallowed up hospital grounds, restricting patients to endless days in ugly, packed, stench-filled wards. Unions resented job competition from unpaid or low-paid patients and pressured hospitals to close the workshops. The well-meaning asylums had degenerated into dreadful snake pits.

My first experience in psychiatry occurred 55 years ago as a medical student in one of these state hospitals. It was degrading and disgusting—an overwhelming smell of urine, neglected patients screaming and posturing, a demoralized and disengaged staff, disappearing doctors.

The “deinstitutionalization” movement meant to correct this chaos arose from a strange combination: public outrage; a new model of community psychiatry; the discovery of powerful new drugs; Kennedy family guilt; and state government greed. Three books were especially influential. The Snake Pit, a semi-autobiographical 1946 novel by Mary Jane Ward (made into an acclaimed 1948 movie), vividly presented a first-hand account of the sufferings of terrorized patients.1 The Myth of Mental Illness, written in 1961 by libertarian psychiatrist Tom Szasz, made the moral and legal argument that patients are citizens with civil rights that must be respected.2 Published in the same year, sociologist Erving Goffman’s Asylums revealed that total institutionalization made patients much sicker and more dependent than they would be in any less crazy-making environment.3

Community psychiatry envisioned an attractive alternative life for the mentally ill—symptoms stabilized by the new antipsychotic meds, living independently in the community, re-socialized and working productively. John Kennedy’s exposure to his sister’s mental illness motivated him to support a comprehensive mental health bill that provided funding for community mental health centers (CMHCs) throughout the United States. State governments were all too eager to close the enormous mental hospitals that were usually their biggest budget line item.

Deinstitutionalization was an ugly business. Patients who had been in hospital for decades were often dumped on the street with just one week’s warning. A man was admitted to my acute inpatient unit the day after he had been discharged from the state hospital that had been his home and business for 22 years. He had achieved great status and relative affluence washing staff members’ cars. Now, without any transition or support, he was disconsolate and could not picture making a new life for himself. A few days later, I had to cut him down after he had hanged himself in the bathroom.

CMHCs eventually did live up to expectations and were thrilling places to work. We saw many of our patients flourish, away from the toxic state hospital environment. Seemingly chronic symptoms were reduced via a happy combination of the new meds, rehab, and socialization. By the 1970s, the United States was the pioneer and world leader in deinstitutionalization and community psychiatry.

Disclosures: 

The author reports no conflicts of interest concerning the subject matter of this article.

References: 

1. Ward MJ. The Snake Pit. New York: Random House; 1946.

2. Szasz TS. The Myth of Mental Illness. New York: Random House; 1961.

3. Goffman E. Asylums: Essays on the Social Situation of Mental Patients and Other Inmates. New York: Doubleday; 1961.

4. Fuller D, Sinclair E, Geller J, et al. Going, Going, Gone: Trends and Consequences of Eliminating State Psychiatric Hospital Beds. Treatment Advocacy Center; June 2016. https://www.treatmentadvocacycenter.org/going-going-gone. Accessed May 15, 2019.

 




New Graham & Doddsville Issue: John Hempton, Yen Liow, Bill Stewart

http://bit.ly/2EqDpI8

Columbia Business School is out with the Spring 2019 issue of its Graham & Doddsville newsletter. It features interviews with Bronte Capital's John Hempton, Aravt Global's Yen Liow, and Stewart Asset Management's Bill Stewart.

It also includes student investment pitches such as long Dollarama, long Align Technology, long Carsales.com, and long Dean Foods 6.50% senior unsecured.

Embedded below is the Spring 2019 issue of Graham & Doddsville:



You can download a .pdf copy here.




John Wick 3 director says the Wachowskis are working on another Matrix film

http://bit.ly/2VeRHRu

In Hollywood, sequels, remakes, and reboots of big franchises have always been big business, and over the years, there’s been a lot of chatter about the potential comeback of one big property: The Matrix. A new report throws a bit more fuel on that particular fire, as John Wick: Chapter 3 director Chad Stahelski says the Wachowski siblings are working on a new installment in the series.

Rumors of a reboot or continuation of the franchise have circled for a couple of years. In 2017, The Hollywood Reporter suggested that Warner Bros. was working on a reboot of some sort, with Michael B. Jordan (Black Panther) potentially attached and Zak Penn (Ready Player One) tapped to write. Notably, that report indicated that the Wachowskis weren’t attached to the project.

Speaking to Yahoo! Movies, Stahelski suggests that not only is a Matrix film in the works, but that the Wachowskis are involved and that it would be a continuation of the original three films. (Clearly, someone read my former colleague Kwame Opam’s take that they should expand the world, rather than remake it from scratch.) Stahelski said:

I’m super happy that the Wachowskis are not just doing a Matrix, but they’re expanding what we all loved, and if it’s anywhere near the level of what they’ve already done, it wouldn’t take more than a call to go, ‘Hey, we want you to be a stunt guy’ and I would probably go and get hit by a car.

A representative for Stahelski told The Verge that the quote is “inaccurate” and that he did not say that the Wachowskis are doing another installment, but that “his comments were talking about if they were, he’d love to be a part of it.”

Whether or not the statement is accurate, it’s obviously the type of situation to approach with a salt shaker at the ready: directors and actors say things all the time while speaking with the press, and Stahelski does hedge a bit, noting that he’s not sure whether Lana Wachowski will be involved as a director. There have been rumors that the sisters are retiring, after they sold their production office in Chicago last year.

The original Matrix debuted in theaters 20 years ago, and the Wachowskis followed up with a pair of sequels, The Matrix Reloaded and The Matrix Revolutions, to mixed reviews. They also oversaw a series of comics; a video game, The Matrix Online, which was set in the aftermath of Revolutions; and The Animatrix, a series of animated shorts set in the Matrix universe. Revolutions certainly leaves open the possibility of more stories: the war between humanity and the machines comes to a close with the entire Matrix getting a reboot, and the Oracle and the Architect’s final scene shows that there’s still some tension, but that the peace will last “as long as it can.”

Presumably, any sequel or tie-in could explore the nature of the post-Revolution world and those tensions between humanity and the machines. Or it could deal with the Matrix breaking down and what post-Matrix life looks like. Certainly, having the Wachowskis return to direct or oversee that reboot would be preferable, given how they’ve directed and conceptualized the philosophy and world.

But the original creators aren’t necessarily a requirement for any continuation. Disney’s reboot of the Star Wars franchise went forward without the explicit help of George Lucas, whose summaries of a post-Return of the Jedi trilogy were essentially tossed out in favor of J.J. Abrams’ vision for the future of the franchise. But The Matrix has always had the Wachowskis’ special brand imprinted on its world, so it’s hard to see how any future of the franchise wouldn’t involve them.

Updated May 10th, 2019, 2:00PM ET: Updated to include statement from Stahelski’s representative.




Jeff Bezos unveils his sweeping vision for humanity’s future in space

http://bit.ly/2EaW3DN

  • Jeff Bezos, owner of Blue Origin, spoke Thursday during an unveiling of his Blue Moon lunar lander.

  • During the speech, he showed off a full-scale mock-up of the Blue Moon lander on stage.

  • Bezos said the lander could be used to carry humans down to the surface of the Moon.

  • The company plans to hotfire the lander's engine, named BE-7, this summer for the first time.

  • Bezos also extolled the virtues of O'Neill cylinders, shown on screen at left.

  • Here's a diagram of the new BE-7 engine.

(Image credits: Mark Wilson/Getty Images; Blue Origin)

WASHINGTON D.C.—The world's richest person, Jeff Bezos, unveiled his sweeping vision for humanity on Thursday afternoon in a Washington D.C. ballroom. With the lights dimmed, Bezos spoke on stage for an hour, outlining plans for his rocket company, Blue Origin, and how it will pave the way to space for future generations.

We have seen bits and pieces of Bezos' vision to use the resources of space to save Earth and make it a garden for humans before. But this is the first time he has stitched it together in such a comprehensive and radical narrative, starting with reusable rockets and ending with gargantuan, cylindrical habitats in space where millions of people could live. This was the moment when Bezos finally pulled back the curtain, in totality, to reveal his true ambitions for spaceflight. This is where he would like to see future generations one day live.

His speech felt akin to the talk SpaceX founder Elon Musk delivered at an international space conference in Mexico City in 2016, where Musk first unveiled a design for a super-large rocket and starship, as well as his plans for millions of humans to live on Mars and make a vibrant world there.

The grandiosity of Bezos' and Musk's visions is similar, and both billionaires believe the first step must involve sharply reducing the cost of access to space. This is why both SpaceX and Blue Origin have, as their core businesses, large reusable rockets.

But their visions also differ dramatically. Musk wants to turn Mars green and vibrant to make humanity a multi-planet species and provide a backup plan in case of calamity on Earth. Bezos wants to preserve Earth at all costs. "There is no Plan B," the founder of Amazon said Thursday.

First, the news

As part of his speech, Bezos revealed new details about a large lunar lander, called "Blue Moon," capable of delivering up to 3.6 tons of cargo and scientific experiments to the lunar surface. Blue Origin has spent three years working on the vehicle, he said.

The company also has a brand-new, previously unannounced engine, named BE-7, with 10,000 pounds of thrust. It will power the Blue Moon vehicle during its descent to the lunar surface. The company will perform its first hotfire test of the BE-7 engine this summer in West Texas, Bezos said.

[Video: Introducing Blue Moon]

Near the end of his speech, Bezos praised the goal set by Vice President Mike Pence of landing humans on the Moon by 2024. "I love this," Bezos said. "It's the right thing to do. We can help meet that timeline but only because we started three years ago. It's time to go back to the Moon—this time to stay."

In a configuration with "stretch tanks," Bezos said Blue Moon could carry up to 6.5 tons to the lunar surface, and this would be large enough for a crewed ascent vehicle. This aligns with NASA's vision for a multi-stage lunar lander that involves both a descent vehicle and then a different spacecraft for humans—the ascent vehicle—that will launch back from the surface of the Moon and return the crew to low lunar orbit. Blue Origin will bid on the descent vehicle portion of NASA's lunar lander contract.

[Image: A Blue Moon lander with an ascent vehicle (built by another company) on top. Credit: Blue Origin]

Bezos, who is self-funding Blue Origin at a rate of approximately $1 billion a year, did not say whether he would fund the development of Blue Moon without NASA contracts for cargo delivery to the lunar surface or the descent module contract for the crew lander.

O’Neill cylinders

Throughout his speech, Bezos displayed his enthusiasm for this topic. He was five years old when he watched the Apollo 11 lunar landing. Spaceflight, and the possibilities it offers for humanity, have fascinated him ever since. "You don't choose your passions—your passions choose you," he said Thursday.

During the first part of his talk, Bezos spoke about the world's looming energy crunch. Human energy use grows at a rate of 3 percent a year, he said, and this figure factors in increasing efficiency in computing, transportation, and other sectors. Today, all of humanity's energy needs could be met by a solar farm covering an area the size of Nevada. In a couple of centuries, a solar farm to meet our needs would cover the entire planet.

At some point, unless humans expand into the Solar System, this growing energy demand will meet with finite resources and energy rationing. "That's the path that we would be on, and that path would lead for the first time to your grandchildren having worse lives than you," he said.
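As a rough sanity check on that compounding claim, here's a back-of-the-envelope sketch; the arithmetic and the area figures are mine, not Blue Origin's:

```python
# Back-of-the-envelope check of the "couple of centuries" claim:
# if energy use grows 3% a year, how long until a Nevada-sized
# solar farm would have to grow to cover the entire planet?
# Assumed figures (mine, not from the speech):
#   Nevada: ~286,000 km^2; Earth's total surface: ~510,000,000 km^2.
import math

nevada_km2 = 286_000
earth_km2 = 510_000_000
growth = 0.03  # 3% annual growth in energy demand

# Solve nevada * (1 + growth)^years = earth for years:
years = math.log(earth_km2 / nevada_km2) / math.log(1 + growth)
print(round(years))  # ~253 years -- consistent with "a couple of centuries"
```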

  • An artist's rendering of a manufactured environment that could exist in space in the future.

  • The design was inspired by Gerard O'Neill, a professor Bezos met when he was a student at Princeton University.

  • Bezos said they could have natural environments or cities within.

  • Up to 1 million people could live in each habitat.

  • Bezos views this as a more viable outcome for human growth than other worlds such as Mars.

(Image credits: Blue Origin)

Other worlds in the Solar System lack Earth's atmosphere and gravity. At most, they could support perhaps a few billion people, Bezos said. The answer is not other planets or moons, he said, but rather artificial worlds or colonies in space known as O'Neill cylinders.

These are named for their creator, Gerard O'Neill, who was a professor at Princeton University where Bezos attended college in the early 1980s. In his book The High Frontier, O'Neill popularized the idea of free-floating, cylindrical space colonies that could have access to ample solar energy. Bezos was hooked then and became president of the campus chapter of Students for the Exploration and Development of Space.

And he is still hooked today, imagining up to 1 million humans living in each cylinder built from asteroid materials and other space resources. Each environment would be climate controlled, with cities, farms, mountains, or beaches. "This is Maui on its best day all year long," Bezos said. "No rain. No earthquakes. People are going to want to live here." And when they need to, they could easily fly back to Earth.

Brave and bold

This is a far-flung future, certainly. Bezos said the challenge for this generation is to build the "road" to space that could, one day, lead to in-space activity that creates such a future.

He cited two "gates" holding back human development of space. One is low-cost access to space, and he noted that his company has already built the reusable New Shepard launch system and will fly the much larger New Glenn rocket into space in 2021.

The second limiting factor is that, to thrive, human activity must rely on resources in space: water from the Moon, metals from asteroids, and energy from the Sun. To this end, Blue Origin has been working on the Blue Moon lander to deliver small rovers and other scientific packages to the Moon. The rovers would suss out information about water ice on the Moon and how it might be harvested for use as rocket fuel. (Conveniently, the Blue Moon lander will use liquid hydrogen.)

Bezos said he believes that—if this generation builds the infrastructure needed to enable humans to get into space and develop an economy there—future generations will pick up the ball and run. "People are so creative once they're unleashed," Bezos said.

It was brave and bold of Bezos to put his entire vision for humanity's aspirations out there all at once. As with Musk in 2016, it opens him up to criticism for being too dreamy about outer space, or for not caring about the immediate problems here on Earth. But the reality is that, while Earth has plenty of problems today, humanity faces existential concerns in the decades and centuries to come. It is good to think about these problems and plant the seeds for solutions that may one day solve them.

It is all the better when the dreamer proposing them has enough money to get the ball rolling.

Listing image by Mark Wilson/Getty Images




Increasingly Competitive College Admissions: Much More Than You Wanted to Know

http://bit.ly/2IAj0D2

0: Introduction

This is from businessstudent.com:

Acceptance rates at top colleges have declined by about half over the past decade or so, raising concern about intensifying academic competition. The pressure of getting into a good university may even be leading to suicides at elite high schools.

Some people have dismissed the problem, saying that a misplaced focus on Harvard and Yale ignores that most colleges are easier to get into than ever. For example, from The Atlantic, Is College Really Harder To Get Into Than It Used To Be?:

If schools that were once considered “safeties” now have admissions rates as low as 20 or 30 percent, it appears tougher to get into college every spring. But “beneath the headlines and urban legends,” Jim Hull, senior policy analyst at the National School Board Association’s Center for Public Education, says their 2010 report shows that it was no more difficult for most students to get into college in 2004 than it was in 1992. While the Center plans to update the information in the next few years to reflect the past decade of applicants, students with the same SAT and GPA in the 90’s basically have an equal probability of getting into a similarly selective college today.

Their link to the report doesn’t work, so I can’t tell if this was ever true. But it doesn’t seem true today. From Pew:

The first graph shows that admission rates have decreased at 53% of colleges, and increased at only 31%. The second graph shows that the decreases were mostly at very selective schools, and the increases were mostly at less selective schools. We shouldn’t exaggerate the problem: three-quarters of US students go to non-selective colleges that accept most applicants, and there are more than enough of these for everyone. But if you are aiming for a competitive school – not just Harvard and Yale, but anywhere in the top few hundred institutions – the competition is getting harder.

This matches my impression of “facts on the ground”. In 2002, I was a senior at a California high school in a good neighborhood. Most of the kids in my class wanted to go to famous Ivy League universities, and considered University of California colleges their “safety schools”. The idea of going to Cal State (California’s middle- and lower- tier colleges) felt like some kind of colossal failure. But my mother just retired from teaching at a very similar school, and she says nowadays the same demographic of students would kill to get into a UC school, and many of them can’t even get into Cal States.

The stories I hear about this usually focus on how more people are going to college today than ever, but there’s still only one Harvard, so there’s increasing competition for the same number of spots.

As far as I can tell, this is false.

The college attendance rate is the same today as it was in 2005. If you’ve seen graphs that suggest the opposite, they were probably graphs of the total number of Americans with college degrees, which only proves that more people are getting degrees today than in the 1940s or whenever it was that the oldest generation still alive went to college.

(in fact, since the birth rate is declining, this means the absolute number of college-goers is going down).

I’ll go further. Harvard keeps building more dorms and hiring more professors, so there are the same number of Harvard spots per American today as there were ten years ago, twenty years ago, and all the way back to the 1800s:

I want to look into this further and investigate questions like:

– How did we get to this point? Have college admissions always been a big deal? Did George Washington have to freak out about getting into a good college? What about FDR? If not, why not?

– Is academia really more competitive now than in the past? On what time scale? At what levels of academia? Why is this happening? Will it stop?

– Is freaking out about college admissions the correct course of action?

1. A Harvard-Centric History Of College Admissions

For the first two centuries of American academia, there was no competition to get into college. Harvard admitted…

(Harvard is by far the best-documented college throughout most of this period, so I’ll be focusing on them. No, Ben Casselman, you shut up!)

…Harvard admitted anyone who was fluent in Latin and Greek. The 1642 Harvard admission requirements said:

When any schollar is able to read Tully [Cicero] or such like classicall Latine Authore ex tempore & make and speake true Latin in verse and prose, suo (ut aiunt) Marte, and decline perfectly the paradigmes of Nounes and Verbes in the Greek tongue, then may hee bee admitted into the Colledge, nor shall any claim admission before such qualifications.

Latin fluency sounds impressive to modern ears, like the sort of test that would limit admission to only the classiest of aristocrat-scholars. But knowledge of classical languages in early Massachusetts was shockingly high, even among barely-literate farmers. In 1647, in between starving and fighting off Indian attacks, the state passed a law that every town of at least 100 families must have a school that taught Latin and Greek (it was called The Old Deluder Satan Law, because Puritans). Even rural families without access to these schools often taught classical languages to their own children. Mary Baker Eddy, who grew up in early 19th-century rural New Hampshire, wrote that:

My father was taught to believe that my brain was too large for my body and so kept me much out of school, but I gained book-knowledge with far less labor than is usually requisite. At ten years of age I was as familiar with Lindley Murray’s Grammar as with the Westminster Catechism; and the latter I had to repeat every Sunday. My favorite studies were natural philosophy, logic, and moral science. From my brother Albert I received lessons in the ancient tongues, Hebrew, Greek, and Latin. My brother studied Hebrew during his college vacations.

By the standards of the time, Harvard admission requirements were tough but fair, and well within the reach of even poorer families. More important, they were only there to make sure students were prepared for the coursework (which was in Latin). They weren’t there to ration out a scarce supply of Harvard spots. In fact this post, summarizing Jerome Karabel’s Chosen, says that “there was no class size limit, because Harvard was trying to compete with Oxford and Cambridge for size”. They wanted as many students as they could get; their only limit was the number of qualified applicants.

These policies continued through the 19th century, with changes only in the specific subjects being tested. In the late 1700s they added some math; in the early 1800s they added some science. You can find a copy of the 1869 Harvard entrance exam here. It’s pretty hard – but it had an 88% pass rate (surely at least in part because you wouldn’t take it unless you were prepared) and everyone who passed was guaranteed a spot at Harvard. Some documents from Tufts around this time suggest their procedure was pretty similar. Some other documents suggest that if you went to a good high school, they assumed you were prepared and let you in without requiring the exams.

When did this happy situation end? Information on this topic is hard to find. I can’t give specific sources, but I get the impression that at the very end of the 19th century, there was a movement to standardize college admissions. At first this just meant making sure every college had the same qualification exams, so that one school wasn’t asking about Latin and another about Greek. This culminated in the creation of the College Board in 1899, which administered an admission test that acted as a sort of great-great-grandfather of the SAT. Very gradually, so gradually that nobody at the time really remarked on it, this transitioned from making sure students were ready, to rationing out scarce spots. By about 1920, the transition was basically complete, so that nobody was surprised when people talked about “how colleges should decide who to accept” or questions like that. If you can find more on this transition, please contact me.

Acceptance was originally based entirely on your score on the qualifying exam. But by the 1920s, high-scorers on this exam were disproportionately Jewish. Although Jews were only about 2% of the US population, they were 21% of Harvard’s 1922 class (for more on why this might happen, read my post The Atomic Bomb Considered As Hungarian High School Science Fair Project). In order to arrest this trend, Harvard and other top colleges decided to switch from standardized testing to an easier-to-bias “holistic admissions” system that would let them implement a de facto Jewish quota.

Quota proponents not only denied being anti-Semitic but argued they were actually trying to fight anti-Semitism; if the student body became predominantly Jewish, this might inflame racial tensions against Jews. Harvard president Abbott Lowell, the quotas’ strongest proponent, said: “The anti-Semitic feeling among the students is increasing, and it grows in proportion to the increase in the number of Jews. If their number should become 40% of the student body, the race feeling would become intense”. Was he just trying to rationalize his anti-Semitism? I don’t think so. I doubt modern Harvard officials are anti-Asian in any kind of a hateful sense, but they enforce Asian quotas all the same. What would they say if you asked them why? Maybe that if a country full of whites, blacks, and Latinos had predominantly Asian elite colleges, that might make (as Lowell put it) “the race feeling become intense”. I see no reason to think that 1920s officials were thinking any differently than their modern counterparts.

Whatever the reasons, by the mid-1920s the Jewish quota was in place and Harvard had switched to holistic admissions. But Lowell and his contemporaries emphasized that the new policies were never meant to make Harvard selective. “It is neither feasible nor desirable to raise the standards of the College so high that none but brilliant scholars can enter…the standards ought never to be so high for serious and ambitious students of average intelligence.”

We’ll talk later about how this utopian dream of top-notch education for anyone with a foreskin failed. But before we get there, a more basic question: how come Harvard wasn’t overrun with applicants? If the academic requirements were within reach of most smart high-schoolers, how come there was no need to ration spots?

Below, I discuss a few possibilities in more depth.

1.1: Historical Tuition Fees

Were early American colleges so expensive that everyone except aristocrats was priced out?

No:

(sources: 1, 2, 3, 4, 5, 6)

I find very conflicting accounts of colonial tuition prices. But after the Revolution, tuition stayed stable at about a third of median income until about 1990, when it increased to 1.5x median income. In other words: relative to income, historical tuition costs were about a fifth of what they are today. Some good universities seem to have had no tuition at all – Stanford had a $0 price tag for its first 35 years.
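(To spell out the arithmetic behind “about a fifth,” using the round numbers above: a third of median income divided by 1.5 times median income is (1/3)/1.5 = 2/9 ≈ 0.22, i.e. roughly one-fifth.)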

Even when tuition existed, historical accounts suggest it wasn’t especially burdensome for most college students, and record widespread efforts to accommodate people who couldn’t pay. The first Harvard scholarship was granted in the 1640s. There are occasional scattered references to people showing up at Harvard without enough money to pay and being given jobs as servants to college officials or other students to help cover costs; in America, Ralph Waldo Emerson took advantage of this kind of program; in Britain, Isaac Newton did.

If you were a poor farmer who couldn’t get a scholarship and didn’t want to work as a servant, sometimes colleges were willing to accept alternative forms of payment. According to The Billfold:

Harvard tuition — which ran about fifty-five pounds for the four-year course of study — was paid the same way [in barter], most commonly in wheat and malt. The occasional New England father sent his son to Cambridge with parsnips, butter, and, regrettably for all, goat mutton. A 141-pound side of beef covered a year’s tuition.

1.2: Discrimination

Early colleges only admitted white men. Did this reduce the size of the applicant pool enough to give spots to all white men who applied?

I don’t think racial discrimination can explain much of the effect. Throughout the 19th century, America hovered around 85% white. New England, where most Harvard applicants originated, may have been 95% to 99% white – see eg this site which says Boston was 1.3% black in 1865; non-black minorities were probably a rounding error. So there’s not much room for racial discrimination to reduce the applicant pool.

The exclusion of women from colleges in the 1800s is less than generally believed:


(source: unprincipled sketchy attempt to combine this with this to get one measure that covers the entire period)

For every woman in college in 1890, there were about 1.3 men; this is no larger a gender gap than exists today, though in the reverse direction. How come you never hear about this? Many of the women were probably in teacher-training colleges or some other gendered institution; until the early 1900s, none of them were at Harvard. But after gender integration, the women’s colleges were usually annexed to the nearest men’s college, turning them into a single institution. Under these circumstances, it doesn’t seem that likely that integration had a huge effect on admissions selectivity. Also, admitting women can only double the size of the applicant pool, but 1800s colleges seemed much more than twice as easy to get into.

Overall I don’t think this was a major part of the difference either.

1.3: Lack Of Degree Requirement For Professional Schools

Nowadays college is competitive partly because people expect it to be their ticket to a good job. But in the 19th century, there was little financial benefit to a college degree.

Suppose you wanted to become a doctor. Most medical schools accepted students straight out of secondary school, without a college degree. In fact, most medical schools accepted all “applicants”, the same as Harvard. Like Harvard, there was sometimes a test to make sure you knew Greek and Latin (the important things for doctors!) but after that, you were in.

(This article has some great stories about colonial and antebellum US medical education. Anyone who wanted could open up a medical school; the profit motive incentivized them to accept everybody. Medical schooling was so profitable that the bottleneck became patients; since there were no regulations requiring medical students to see patients, less scrupulous schools tended to skip this part. Dissection was a big part of the curriculum, but there were no refrigerators, so fresh corpses became a hot commodity. Grave robbing was a real problem, sparking small-scale wars between medical schools and their local towns. “In at least 2 instances, the locals actually raided the school to obtain a body. In 1 case, the school building was destroyed by fire, and in another, 2 people, a student and a professor, were killed.” There were no requirements for how long medical schools should last, so some were as short as nine months. But there were also no requirements for who could call themselves a doctor, so students would sometimes stay until they got bored, then drop out and start practicing anyway. Tuition was about $100 per year, plus cost of living and various hidden fees; by my estimates that’s about half as much (as a percent of an average doctor’s salary) as medical school tuition today. This situation continued until the Gilded Age, when medical schools started professionalizing themselves a little more.)

Or suppose you wanted to be a lawyer. The typical method was called “reading law,” which meant you read some law textbooks, served an apprenticeship with a practicing lawyer, and then started calling yourself a lawyer (in some states you also needed a letter from a court testifying to your “good moral character”). Honestly, the part where you apprenticed with a practicing lawyer was more like a good idea than a requirement. It’s not completely clear to me that you needed to do anything other than read enough law textbooks to feel comfortable lawyering, and then go lawyer. Most lawyers did not have a college degree.

Abraham Lincoln, a lawyer himself, advised a law student:

If you are absolutely determined to make a lawyer of yourself the thing is more than half done already. It is a small matter whether you read with any one or not. I did not read with any one. Get the books and read and study them in their every feature, and that is the main thing. It is no consequence to be in a large town while you are reading. I read at New Salem, which never had three hundred people in it. The books and your capacity for understanding them are just the same in all places.

Levi Woodbury, the 30th US Supreme Court Justice (appointed 1846), was the first to have attended any kind of formal law school. James Byrnes, the 81st Supreme Court Justice (appointed 1941), was the last not to have attended law school. It’s apparently still technically possible in four states (including California) to become a lawyer by reading law, but it’s rare and generally discouraged.

The ease of entering these professions helps explain why there was no oversupply of Harvard applicants. But then why wasn’t there an oversupply of doctors and lawyers? We tend to imagine that of course you need strict medical school admissions, because some kind of unspecified catastrophe would happen if any qualified person who wanted to could become a doctor. Did these open-door policies create a glut of professionals?

No. There were fewer doctors and lawyers per capita than there are now.

Did it drive down salaries for these professions?

I don’t have great numbers on lawyer salaries, but based on this chart from 1797 Britain and this chart from 1900s America, I get the impression that throughout this period lawyers made about 3-5x as much as unskilled laborers, 3-4x as much as clerks and teachers, and about the same as doctors. This seems to match successful modern lawyers, and probably exceeds average modern lawyers. This may be because unskilled laborers now earn a minimum wage and teachers have unions, but in any case the 19th-century premium to being a lawyer seems to have been at least as high as today’s, and probably higher.

The same seems true of doctor salaries. The paper above estimates physician salaries at $600 per year, during a time when agricultural laborers might have been making $100 and clerks and teachers twice that.
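
For concreteness, here are the multiples implied by those figures, using only the numbers quoted above:

```python
# Salary multiples implied by the figures just quoted (all $/year).
physician = 600
farm_laborer = 100       # agricultural laborer
clerk_or_teacher = 200

print(f"Doctor vs. farm laborer:     {physician / farm_laborer:.0f}x")     # 6x
print(f"Doctor vs. clerk or teacher: {physician / clerk_or_teacher:.0f}x")  # 3x
```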

I conclude that letting any qualified person become a doctor or a lawyer, without gatekeeping, did not result in a glut of doctors and lawyers, and did not drive down salaries for those professions beyond levels we would find reasonable today.

1.4: Conclusions

So why weren’t there gluts of would-be college students, doctors, and lawyers? I can’t find any single smoking gun, but here are some possibilities.

Throughout this period, between 60% and 80% of Americans were farmers. Unless you were wealthy or urban, the question of “what career do you want in order to actualize your potential” didn’t come up. You were either going to be a farmer, or else you had some specific non-farm pathway in mind that you could pursue directly instead of getting a college degree to “keep your options open”.

Since rural children were expected to work on the farm, there was no protracted period of educational unproductivity. There was no assumption that your kids weren’t going to be earning anything until age 18 and so you might as well protract their unproductivity until age 22. That meant that paying to send your child to Boston or wherever, and to support him in a big-city lifestyle for four years, was actually a much bigger deal than the tuition itself. This article claims that in 1816, tuition itself was only about 10% of the expenses involved in sending a child to college (granted, poor people pinching pennies could get by for much less than the hypothetical well-off student analyzed here, but I think the principle still holds).

Another limiting factor may have been that there was ample opportunity outside of college and the professions, in almost every area. Twelve US presidents, including George Washington, did not go to college. Benjamin Franklin, everyone’s model of an early American polymath genius, did not go to college. Of the ten richest people in American history (mostly 19th-century industrialists), as far as I can tell only two of them went to college. Aside from the obvious race and gender discrimination, the 19th century was a lot closer to real meritocracy than today’s credentialist fake meritocracy; people responded rationally by ignoring credentials and doing meritorious things.

2. How Did The Zero-Competition Regime Transition To The Clusterf**k We Have Today?

Here is a graph of Harvard admission rates over time, based mostly on these data:

During the early part of the 1900s, Harvard was still in the 19th-century equilibrium of admitting most qualified non-Jewish applicants. Around 1940, the admission rate dropped from 95% to 25%. Most sources I read attribute this to the GI Bill, a well-intentioned piece of legislation that encouraged returning WWII veterans to get a college education. So many vets took the government up on the offer that Harvard was overwhelmed for the first time in its history.

But this isn’t the whole story.

You’ve seen this before – this is the percent of Americans (by gender) who graduated college. It’s sorted by birth cohort, which means 1920 on the x-axis corresponds to the people who were in college in the 1940s – eg our GIs. The GI Bill is visible on this graph – around 1920, there is a spike in attendance for men but not women, which is the pattern we would predict from GIs. But it only takes the college graduation rate from 10% to 15% (compared to its current 40%). And after the GI Bill, the college graduation rate starts dropping again – as we would expect of a one-time shock from a one-time war. And between 1955 and 1960, Harvard admissions rebound to about 40% of applicants.

The big spike in college attendance rates – and a corresponding dip in Harvard admission percentage – takes place in the 1938 to 1952 birth cohort. Why are all these people suddenly going to college? They’re dodging the draft. A big part of the increase in college admissions was people taking advantage of the college loophole to escape getting sent to Vietnam.

Again, this is a one-time shock, and mostly applies to men. So how come we see a quadrupling of college graduation during this period affecting men and women alike?

A standard narrative says that work has gotten more difficult over the past century, and so workers need more education. I’ve always found this hard to believe. In other countries, students still go to medical school and law school without a separate college degree first. Programming is a classic example of a high-skilled complicated modern profession, but many programmers dropped out of college, many others didn’t attend at all, and many programming “boot camps” are opening up offering to teach programming skills outside the context of a college education. And in many of the jobs that do require college education, the education is irrelevant to their work. Both of my adult jobs – as an English teacher and as a doctor – required me to have a college degree in order to apply. But my college education was relevant to neither (I’m a philosophy major). The degree requirement seemed like more of a class barrier / signaling mechanism than an assertion that only people who knew philosophy could make good teachers and doctors. I realize I’m making a strong claim here, and I don’t have space to justify it fully – for more on this, read my Against Tulip Subsidies and SSC Gives A Graduation Speech – or better yet, Bryan Caplan’s The Case Against Education.

If increasing need for skills didn’t cause increasing college attendance, what did? Again, this is based on idiosyncratic beliefs I don’t have the space to justify (again, read Caplan), but it could be a sort of self-reinforcing signaling cycle. Once the number of people in college reached a certain level, it led to a well-known social expectation that intelligent and conscientious men would have college degrees, which made college a sign of intelligence and, conversely, not having been to college a sign of stupidity. If only 10% of smart/hard-working people have been to college, not having a college degree doesn’t mean someone isn’t smart/hard-working; if 90% of smart/hard-working people have been to college, not having a college degree might call their intelligence and work ethic into question. This cycle meant that after the shocks of the mid-1900s, there was a strong expectation of a degree in the knowledge professions, which forced women and later generations of men to continue going to college to keep up. The government’s decision to provide an endless stream of supposedly-free college loans exacerbated the problem and sabotaged the only natural roadblock that could have stopped it.
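
The cycle is easy to see in a toy Bayesian calculation. Every rate below is invented for illustration; the point is only the shape of the effect.

```python
# How much does *not* having a degree count against you, as a function
# of how many smart/hard-working people go to college? All rates invented.

P_SMART = 0.25  # assumed base rate of smart/hard-working people

def p_smart_given_no_degree(attend_if_smart, attend_if_not=0.05):
    """P(smart | no degree), by Bayes' rule."""
    no_degree_smart = P_SMART * (1 - attend_if_smart)
    no_degree_other = (1 - P_SMART) * (1 - attend_if_not)
    return no_degree_smart / (no_degree_smart + no_degree_other)

for rate in (0.1, 0.5, 0.9):
    print(f"{rate:.0%} of smart people attend -> "
          f"P(smart | no degree) = {p_smart_given_no_degree(rate):.0%}")
```

With only 10% of smart people attending, lacking a degree barely moves the estimate off the base rate; with 90% attending, it becomes strong evidence against you.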

At the same time, several factors were coming together to discourage hunch-based “I like the cut of his jib” style hiring practices. Community ties were becoming weaker, so hirers typically wouldn’t have social contacts with potential hirees. Family businesses whose owners could hire based on hunches were giving way to large corporations where interviewers would have to justify their hiring decisions to higher-ups. Increasing concern about racism was raising awareness that hunch-based hiring tended to discriminate against minorities, and the advent of the discrimination lawsuit encouraged hiring based on objective criteria so you could prove you rejected someone for reasons other than race. The Supreme Court decision Griggs v. Duke Power may or may not have played a role by making it legally risky for corporations to give prospective hires aptitude tests. All of this created a “perfect storm” where employers needed some kind of objective criteria to evaluate potential new hires, and all the old criteria weren’t cutting it anymore. The rise of the college degree as a signal for intelligence, and the increased sorting of people by college selectivity, fit into this space perfectly.

Once society established that knowledge-worker jobs needed college degrees, the simultaneous rises in automation, globalization, and inequality, which made knowledge-worker jobs increasingly necessary to earn a living, completed the process.

If my story were true, this would suggest college attendance would not have risen so quickly in other countries that didn’t have these specific factors. I don’t have great cross-country data, but here’s what I can find:

College attendance in the UK supposedly remained very low until a 1992 act designed to encourage it, but it looks like part of that is just the reclassification of some other schools as colleges. I don’t know how it really compared to the US, and I welcome information from British readers who know more than I do about this. Throughout the rest of the world, college attendance lagged North America by a long time, but the continent-wide categories probably combine countries at different levels of economic development. I don’t really know about this one.

Moving on: the graphs in the Introduction show that college attendance has been stable since about 2005. Why did the rise stop? These articles point out a few relevant trends.

First, the economy is usually to blame for this kind of thing. There was a slight increase in attendance during the 2008 recession, and a slight decrease during the recent boom. But over the course of the cycle, it still seems like the increase in college attendance has slowed or stopped overall, in a way that wasn’t true of past business cycles.

Second, birth rates are decreasing, which means fewer college-aged kids. The national population is still increasing, mostly because of immigrants, but many immigrants are adults without much past education, so they’re not as significant a contributor to the college population.

Third, the price of college keeps going up. I’m surprised to hear this as a contribution to declining attendance, because I thought it was the glut of students that kept prices high, but maybe both factors affect each other.

Fourth, for-profit colleges are falling apart.

In some cases, the government has shut them down for being outright scams. In other cases, potential students have wised up, realized they are outright scams, and stopped being interested in attending them. These colleges advertised to (some would say “preyed on”) people who weren’t able to get into other colleges, so their collapse looks like a fall in the college enrollment/graduation rate.

These are all potentially relevant, but they seem kind of weak to me: the sort of thing that explains the year-to-year trend, but not why the great secular movement in favor of more college has stopped.

Maybe it’s just reached a natural ceiling. Seventy percent of high school graduates are now going to college. The remaining 30% may disproportionately include people with serious socioeconomic or health problems that make going to college very hard for them.

Also, keep in mind that only about 60% of college students graduate in anywhere near the expected amount of time. Some economists have come up with rational-college-avoidance models where people who don’t expect to be able to graduate from college don’t waste their money trying.
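
For concreteness, here is what one of those models looks like in miniature — a toy expected-value calculation in which every number is invented:

```python
# Toy expected-value model of "rational college avoidance": if your
# odds of graduating are low enough, attempting college is a bad bet.
# All figures are invented for illustration.

COST_PER_YEAR = 40_000    # tuition + living + forgone wages ($)
YEARS_IF_GRADUATE = 4
YEARS_IF_DROPOUT = 2      # assume dropouts leave after two years
DEGREE_PREMIUM = 200_000  # assumed lifetime earnings boost from a degree

def expected_value(p_graduate):
    win = DEGREE_PREMIUM - COST_PER_YEAR * YEARS_IF_GRADUATE  # +40,000
    lose = -COST_PER_YEAR * YEARS_IF_DROPOUT                  # -80,000
    return p_graduate * win + (1 - p_graduate) * lose

for p in (0.9, 0.6, 0.3):
    print(f"P(graduate) = {p:.0%}: EV = ${expected_value(p):,.0f}")
```

Under these made-up numbers, attempting college stops paying off somewhere below a two-thirds chance of graduating — which is roughly where the actual graduation rate sits.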

3. If Number Of Students Applying To College Has Been Constant Or Declining Over The Past Ten Years, Why Are Admissions To Top Colleges So Much More Competitive?

To review: over the past ten years, the number of US students applying to college has gone down (the number applying to four-year private colleges has stayed about the same). But Harvard’s acceptance rates have decreased by half, with similar cuts across other top schools, and more modest cuts across most good and moderately-good colleges. There’s also a perception of much greater pressure on students to have perfect academic records before applying. Why?

3.1: Could the issue be increasing number of international students?

This would neatly match the evidence of constant US numbers vs. increasing selectivity.

Harvard equivocates between a few different definitions of “international student”, but I think it’s comparing apples to apples when it says the Class of 2013 was 10% foreign citizens and the Class of 2022 is 12%. These two classes bound the time period we’re worrying about, and this doesn’t seem like a big change. Also, across all US colleges international student enrollments seem to be dropping, not increasing. Some of this may have to do with strict Trump administration visa policies, or with international perceptions of increasing US hostility to foreigners.

Since fewer international students are applying in general, and even top schools show only a trivial increase, this probably isn’t it.

3.2: Could the issue be more race-conscious admission policies?

Might top colleges be intensifying affirmative action and their preference for minorities and the poor, thus making things harder for the sort of upper-class white people who write news articles about the state of college admissions? Conversely, might colleges be relaxing their restrictions on high-achieving Asians, with the same result?

This matches the rhetoric colleges have been putting out lately, but there aren’t many signs it’s really happening. Harvard obsessively chronicles the race of its student body, and the class of 2010 and class of 2022 have the same racial composition. The New York Times finds that whites are actually better represented at colleges (compared to their percent of the US population) than they were 35 years ago, although Asians are the real winners.

The Times doesn’t explain why this is happening. It may be due to weakening affirmative action, including bans by several states. Or it may be because of a large influx of uneducated Mexican immigrants who will need a few more generations of assimilation before their families attend college at the same rate as whites or previous generations of Latinos.

What about Asians? There was a large increase in Asian admissions, but it was mostly before this period. The Ivy League probably has some kind of unofficial Asian quota which has been pretty stable over the past decade. Although the Asian population continues to grow, and their academic achievement continues to increase, this probably just increases intra-Asian competition rather than affecting people of other races.

3.3: Could the issue be increasing number of applications per student?

Here’s an interesting fact – even though no more Americans or foreigners are applying to colleges today vs. ten years ago, Harvard is receiving twice as many applications – from about 20,000 to more than 40,000. How can this be?

The average college applicant is sending out many more applications.

I am not Harvard material. But when I was looking at colleges, my mother pressured me to apply to Harvard. “Come on!” she said. “It will just take a few hours! And who knows? They might accept you! You’ll never get in if you don’t try!”

Harvard did not accept me. But my mother’s strategy is growing in popularity. Part of this might be genuine egalitarianism. Maybe something has gone very right, and the average American really does believe he or she has a shot at the Ivy League. But part of it may also be a cynical ploy by colleges to improve their rankings in US News and other similar college guides. These rankings are partly based on how “selective” they are, ie what percent of students they turn away. If they encourage unqualified candidates to apply, they can turn those unqualified candidates away, and then they appear more “selective” and their ranking goes up.

But increased application volume is mostly driven by an increasingly streamlined college admissions process, including the Common Application. I didn’t like my mother’s advice, because every college application I sent in required filling in new forms, telling them my whole life story all over again, and organizing all of it into another manila envelope with an enclosed check. It was like paying taxes, except with essay questions. And there was a good chance you’d have to do it all over again for each institution you wanted to apply to. Now that’s all gone. 800 schools accept the Common Application, including the whole Ivy League. From the Times again:

Six college applications once seemed like a lot. Submitting eight was a mark of great ambition. For a growing number of increasingly anxious high school seniors, figures like that now sound like just a starting point…

For members of the class of 2015 who are looking at more competitive colleges, their overtaxed counselors say, 10 applications is now commonplace; 20 is taking on a familiar ring; even 30 is not beyond imagining. And why stop there? Brandon Kosatka, director of student services at the Thomas Jefferson High School for Science and Technology in Alexandria, Va., recently worked with a student who wanted a spot in a music conservatory program. To find it, she applied to 56 colleges. A spokeswoman for Naviance, an online tool that many high school students and their counselors use to keep track of applications, said one current user’s “colleges I’m applying to” tab already included 60 institutions. Last year the record was 86, she said.

Does this mean increasing competitiveness is entirely an illusion? Suppose in the old days, each top student would apply to either Harvard or Yale. Now each top student applies to both Harvard and Yale, meaning that both colleges get twice as many applicants. Since each of them can only admit the same number of students, it looks like their application rate has been cut in half. But neither one has really become more competitive!

This can’t quite be it. After all, in the first case, Yale would expect 100% of accepted students to attend. In the second, Yale would know that about 50% of accepted students would choose Harvard instead, so it would have to accept twice as many students, and the acceptance rate per application wouldn’t change.

But if more people are following my mother’s strategy of applying to Harvard “just in case” even when they’re not Harvard material, then this could be an important factor. If the number of people who aren’t Harvard material but have mothers who imagine they are is twice as high as the number of people who really are Harvard material, then Harvard applications will triple. And if Harvard does accept one of these long-shot applicants, they will definitely attend, so there is no need for Harvard to increase its admission rate to compensate. Here there really is an illusion of increasing competition.
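
A toy calculation makes the illusion explicit. All the numbers below are invented; the point is only that applications can triple while no qualified applicant’s real chances change.

```python
# Toy model of the application-inflation illusion. Numbers invented.

QUALIFIED = 1_000   # genuinely "Harvard material" applicants
LONG_SHOTS = 2_000  # applicants following my mother's strategy
SPOTS = 1_000       # class size, fixed throughout

# Old regime: only the qualified apply.
old_rate = SPOTS / QUALIFIED                 # 100%

# New regime: same qualified applicants, plus the long shots. The
# admissions office still picks the same people, and anyone it admits
# would certainly attend, so it needn't over-admit -- but the headline
# acceptance rate collapses.
new_rate = SPOTS / (QUALIFIED + LONG_SHOTS)  # 33%

print(f"Applications: {QUALIFIED} -> {QUALIFIED + LONG_SHOTS}")
print(f"Acceptance rate: {old_rate:.0%} -> {new_rate:.0%}")
```

The headline acceptance rate falls from 100% to 33%, but the same thousand students get in either way.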

Finally, this process could increase sorting. Suppose that, for the first time in history, a Jewish mother had an accurate assessment of her son’s intellectual abilities, I really was Harvard material, and I was unfairly selling myself short. If the existence of a Common Application lets more people apply to Harvard “just in case”, and if the Harvard admissions committee is good at their job, then the best students will get more efficiently matched with the best institutions. In the past, Harvard might have been losing a lot of qualified applicants to unjustified pessimism; now all those people will apply and the competition will heat up.

And in the past, I think a lot of people, including really smart people, just went to the nearest halfway-decent state college to their house. Partly this was out of humility. Partly it was because people cared about family and community more. And partly it was because college wasn’t yet viewed as the be-all and end-all of your value as a human being, where you had to get into the Ivy League or else your life was over. If all these people are now trying to get into Harvard, that will increase competition too.

Can we measure this?

This is the best I can do. It shows that over the past ten years, the number of students at public universities who come from in-state has dropped by 5%. This is probably related to sorting – people working on sorting themselves efficiently will go to the best school they can get into rather than just the closest one in their state. But it’s not a very dramatic difference. I suspect, though I can’t prove, that this is hiding a larger change at the very top of the distribution.

3.4: Could the issue be that students are just trying harder?

Imagine the exact same students applying to the exact same schools. But in 2009, they take it easy and start studying for their SATs the night before, and in 2019, they all have private tutors and are doing five extracurricular activities. College admissions will seem more competitive in 2019.

Any attempt to measure this will be confounded by reverse causation – increased effort might or might not cause increased selectivity, but increased selectivity will definitely cause increased effort. I’m not sure how to deal with this.

If studying harder improves SAT scores, these could be a proxy for how much effort students are putting in. They changed the test in 2016 in a way that makes scores hard to compare, but we can at least compare scores from earlier years. Scores declined between 2005 and 2015 in both math and reading. This may be because more students are taking the SAT (1.5 million in 2008 vs. 2.1 million in 2018), so test-takers are a less selected population. This is kind of surprising given that college enrollment is stable or declining, but it could be that as part of pro-equality measures, schools are pressuring more low-achieving kids to take the SATs in order to “have a chance at college”, but those students don’t really end up attending. In support of this theory, scores are declining most quickly among blacks, Hispanics, and other poorer minority groups who may not have taken the SAT in earlier years; they are stable among whites, and increasing among Asians (increasing numbers of whom may be high-achieving Chinese immigrants). At least, this is the best guess I can come up with for why this pattern is happening. But it means SATs are useless as a measure of whether students are “trying harder”.

Why might students be trying harder? If there’s a ten year lag between things happening and common knowledge that the things have happened, the explosion of college attendance during the 1990s, with an ensuing increase in competitiveness, might have finally percolated down to the average student in the form of advice that getting into college is very hard and they should work to be more competitive. In addition, the Internet is exposing new generations of neurotic parents to messages that unless their child is perfect they will never get into college and probably die alone in a ditch.

Further, the decline of traditional criteria might be causing an increasing emphasis on extracurriculars, which take a harder toll on students. Because of grade inflation, colleges are no longer counting high school grades as much as they used to; because meritocracy is passé, they’re no longer paying as much attention to the SAT. This implies increased emphasis on extracurriculars – things like student government, clubs, internships, charitable work, and the like. Despite popular misconceptions, the SAT is basically an IQ test, and doesn’t really reward obsessive freaking out and throwing money at the problem. But getting the right set of extracurriculars absolutely rewards obsessively freaking out and throwing money at the problem. Maybe twenty years ago, you just played the IQ lottery and hoped for the best, whereas now you work yourself ragged trying to become Vice-President of the Junior Strivers Club.

But all of this is just speculation; I really don’t know how to get good data on these subjects.

3.5: Are funding cuts reducing the number of college spots available?

Some people argue that cuts in public education are reducing the number of positions available at public universities, meaning the same number of students are competing for fewer spots. This source confirms large cuts in public funding:

These universities have tried to compensate by increasing tuition (or by increasing the percent of out-of-state students, who pay higher tuition). It looks like they’ve done this on a pretty much one-to-one basis, so that they’re actually getting more money per student now than they did when public funding was higher.

And from California:

It’s not clear that declining state support affected enrollment at all. Colleges just raised their prices by a lot.

In 2007, 2.8x as many students were in public universities compared to private ones. In 2017, the ratio was 2.9. If the problem were limited availability of public universities to absorb students, we might expect the percent of students at public universities to go down. This doesn’t seem to be happening.

Overall it doesn’t look like funding cuts to public universities mattered very much here.

3.6: Conclusions?

The clearest reason for increasing academic competition in the past ten years is the increasing number of applications per person, enabled by the online Common Application. This has doubled the number of applications sent to top colleges like Harvard despite the applicant pool staying the same size. Some of this apparent increased competition is a statistical illusion, but parts of it may be real due to increased sorting.

Other reasons may include increased common knowledge of intense competition making everyone compete more intensely, and decreased use of hard-to-game metrics like the SAT in favor of easy-to-game metrics like extracurriculars.

4. What Has Been Happening Beyond The College Level?

Competition is intensifying.

Between 2006 and 2016, the number of applicants to US medical schools increased by 35% (note change in number of applicants, not number of applications).

In a different statistic covering different years, the number of people enrolled at medical school increased 28% from 2002 to 2017. These two numbers aren’t directly comparable, but by eyeballing them we get the impression that the number of spots is increasing more slowly than the number of applicants, probably much more slowly.
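
One rough way to compare the two figures, since they cover different windows, is to annualize them. This is a loose sketch using only the numbers quoted above:

```python
# Convert each statistic to an annualized growth rate so the eyeball
# comparison is slightly less loose. Figures are the ones quoted above.

applicant_growth = 1.35   # +35% applicants, 2006-2016 (10 years)
enrollment_growth = 1.28  # +28% enrollment, 2002-2017 (15 years)

applicants_cagr = applicant_growth ** (1 / 10) - 1   # ~3.0%/yr
enrollment_cagr = enrollment_growth ** (1 / 15) - 1  # ~1.7%/yr

print(f"Applicants: ~{applicants_cagr:.1%} per year")
print(f"Enrollment: ~{enrollment_cagr:.1%} per year")
```

Applicants growing around 3% a year against spots growing under 2% a year would tighten admissions steadily.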

As predicted, the MCAT (the med school version of the SAT) scores necessary for admission have been increasing over time.

This is also the impression I have been getting from doctors I know who work in the medical school and residency admissions process. I got to interview some aspiring residents a few years ago for a not-even-all-that-impressive program, and they were fricking terrifying.

Law schools keep great data on this (thanks, law schools!). US News just tells us outright that law schools are less competitive than in 2008, even at good programs. Here’s the graph:

And despite it feeling like lawyers are everywhere these days, law school attendance has really only grown at the same rate as the population since 1970 or so, and it has dropped over the past decade. This may relate to word getting out that being a lawyer is no longer as lucrative a career as it used to be.

Unlike law schools, graduate schools basically fail to keep any statistics whatsoever, and anything that might be happening at the graduate level is a total mystery. We know the number of PhDs granted:

…and that’s about it.

Part of what inspired me to write this post was listening to a famous scientist (can’t remember who) opine that back when he was a student in the 1940s, he kind of wandered into science, found a good position at a good lab, worked up the ranks to become a lab director, and ended up making great discoveries. He noted that this was unthinkable today – you have to be super-passionate to get into science grad school, and once you’re in you have to churn out grant proposals and be the best of the best to have any shot at one day having a lab of your own. I’ve heard many people say things like this, but I can’t find the evidence that would put it into perspective. If anyone knows more about the history of postgraduate education and work in the sciences, please let me know.

I’m also interested in this because it would further help explain undergraduate competition. If more people were gunning for med school and grad school, it would be more important to get into a top college in order to have a good chance of making it in. Since increasing inequality and returns to education have made advanced-degree jobs more valuable relative to bachelors-only jobs, this could explain another fraction of academic competitiveness. But aside from the medical school data, I can’t find evidence that this is really going on.

5. Is Freaking Out Over College Admissions Correct?

Dale and Krueger (2011) examine this question, using lifetime earnings as a dependent variable.

In general, they find no advantage from attending more selective colleges. Although Harvard students earn much more than University of Podunk students, this is entirely explained by Harvard only accepting the highest-ability people. Conditional on a given level of ability, people do not earn more money by going to more selective colleges.

A subgroup analysis did find that people who started out disadvantaged did gain from going to a selective college, even adjusted for pre-existing ability. Blacks, Latinos, and people from uneducated families all gained from selective college admission. The paper doesn’t speculate on why. One argument I’ve heard is that colleges, in addition to providing book-learning, help induct people into the upper class by teaching upper-class norms, speech patterns, etc, as well as by ensuring people will have an upper-class friend network. This may be irrelevant if you’re already in the upper class, but useful if you aren’t.

A second possibility might be that college degrees are a signal that helps people overcome statistical discrimination. Studies have shown that requiring applicants to share drug test results or criminal histories usually increases black applicants’ chances of getting hired. This is probably because biased employers assume the worst about blacks (that they’re all criminal drug addicts), and so letting black applicants prove that they’re not criminal drug addicts puts them on more equal footing with white/Asian people. In the same way, if employers start with an assumption of white/Asian competence and black/Latino incompetence, selective college attendance might not change their view of whites/Asians, but might represent a major update to their view of blacks/Latinos.

Dale and Krueger also find that the value of college did not increase during the period of their study (from 1976 to 1989).

Does this mean that at least whites and Asians can stop stressing out about what colleges they get into?

What if you want to go to medical or law school? I can’t find an equally rigorous study, but sites advising prospective doctors tell them that the college they went to matters less than you’d think. The same seems true for aspiring lawyers. As usual, there is no good data for graduate schools.

What if you want to be well-connected and important?

From here, the percent of members of Congress who went to Ivy League colleges over time, by party:

Only about 8% of Congresspeople went to Ivy League colleges, which feels shockingly low considering how elite they are in other ways. The trend is going up among Democrats but not Republicans. There is obviously a 40-50 year delay here and it will be a long time before we know how likely today’s college students are to get elected to Congress. But overall this looks encouraging.

On the other hand, presidents and Supreme Court Justices are overwhelmingly Ivy. Each of the last five presidents went to an Ivy League school (Clinton went to Georgetown for undergrad, but did his law degree at Yale). Every current Supreme Court justice except Clarence Thomas went to an Ivy for undergrad, and all of them including Thomas went to an Ivy for law school. But there’s no good way to control for whether this is because of pre-existing ability or because the schools helped them succeed.

Tech entrepreneurs generally went to excellent colleges. But here we do have a hint that this was just pre-existing ability: many of them dropped out, suggesting that neither the coursework nor the signaling value of a degree was very important to them. Bill Gates, Mark Zuckerberg, and Larry Ellison all dropped out of top schools; Elon Musk finished his undergrad, but dropped out of a Stanford PhD program after two days. This suggests that successful tech entrepreneurs come from the population of people smart enough to get into a good college, but don’t necessarily benefit from the college itself.

Overall, unless people come from a disadvantaged background, there’s surprisingly little evidence that going to a good college as an undergraduate is helpful in the long term – except possibly for a few positions like President or Supreme Court justice.

This doesn’t rule out that it’s important to go to a good institution for graduate school; see this paper. In many fields, a prestigious graduate school is almost an absolute requirement for becoming a professor. But there doesn’t seem to be an undergrad equivalent of this.

Digression: UC schools

I mentioned at the beginning the universal perception in California that UCs are much harder to get into. I know this is the perception everywhere, but it seems much worse in California. Sure, it’s anecdotal evidence, but the anecdotes all sound like this:

My friend’s daughter got 3.85 GPA, had 5 AP classes in high school, was on competitive swimming team, volunteered 100+ hours, was active in school activities, yet she got rejected by all 4 UCs that she applied to. And these were not even the highest tier of UCs, not Berkeley. She did not apply for more schools and thought that UC San Diego and UC Santa Cruz were her safe choices. The whole family is devastated.

The data seem to back this up. Dashed line is applications, dotted line is admissions, solid line is enrollments:

…but I don’t know how much of this is just more applications per person, like everywhere else.

Why should UC schools be hit especially hard? I assumed California’s population was growing faster than the rest of the country’s, but this doesn’t seem true: both California and the US as a whole grew 13% between 1990 and 2000, when the cohort attending college between 2008 and 2018 would have been born.

The Atlantic points out that, because of budget cuts, UC schools are admitting more out-of-state students (who have to pay higher tuition), lowering the number of spots available to Californians. But is this really that big an effect?

It looks like nonresidents went from 6% to 12% over the space of a decade. That shouldn’t screw things up so badly.
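
A quick check of that intuition, assuming total enrollment held constant (my assumption, purely for illustration):

```python
# If nonresidents went from 6% to 12% of enrollment, how much did the
# California-resident share shrink? Assumes total spots held constant.

res_before = 1 - 0.06  # 94%
res_after = 1 - 0.12   # 88%

print(f"Resident share: {res_before:.0%} -> {res_after:.0%}")
print(f"Relative cut in resident spots: {1 - res_after / res_before:.1%}")  # ~6.4%
```

A roughly 6% cut in resident spots is real, but nowhere near enough to explain the anecdotes.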

I’m really not sure about this. One possibility is that California’s schools are remarkably good. On money.com’s list of best colleges, four of the top ten schools are UCs, plus you get to live in California instead of freezing to death in New England. Since the college admissions crisis is concentrated at the top schools, California has been hit especially hard.

I’m not satisfied with this explanation; let me know if you know more.

6. Conclusions

1. There is strong evidence for more competition for places at top colleges now than 10, 50, or 100 years ago. There is medium evidence that this is also true for upper-to-medium-tier colleges. It is still easy to get into medium-to-lower-tier colleges.

2. Until 1900, there was no competition for top colleges, medical schools, or law schools. A secular trend towards increasing admissions (increasing wealth + demand for skills?) plus two shocks from the GI Bill and the Vietnam draft led to a glut of applicants that overwhelmed schools and forced them to begin selecting applicants.

3. Changes up until ten years ago were because of a growing applicant pool, after which the applicant pool (both domestic and international) stopped growing and started shrinking. Increased competition since ten years ago does not involve applicant pool size.

4. Changes after ten years ago are less clear, but the most important factor is probably the ease of applying to more colleges. This causes an increase in applications-per-admission which is mostly illusory. However, part of it may be real if it means students are stratifying themselves by ability more effectively. There might also be increased competition just because students got themselves stuck in a high-competition equilibrium (ie an arms race), but in the absence of data this is just speculation.

5. Medical schools are getting harder to get into, but law schools are getting easier to get into. There is no good data for graduate schools.

6. All the hand-wringing about getting into good colleges is probably a waste of time, unless you are from a disadvantaged background. For most people, admission to a more selective college does not translate into a more lucrative career or a higher chance of admission to postgraduate education. There may be isolated exceptions at the very top, like for Supreme Court justices.

I became interested in this topic partly because there’s a widespread feeling, across the political spectrum, that everything is getting worse. I previously investigated one facet of this – that necessities are getting more expensive – and found it to be true. Another facet is the idea that everything is more competitive and harder to get into. My parents’ generation tells stories of slacking off in high school, not worrying about it too much, and knowing they’d get into a good college anyway. Millennials tell stories of an awful dog-eat-dog world where you can have perfect grades and SAT scores and hundreds of hours of extracurriculars and still get rejected from everywhere you dreamed of.

I don’t really have a strong conclusion here. At least until ten years ago, colleges were harder to get into because more people were able to (or felt pressured to) go to college. The past ten years are more complicated, but might be because of increased stratification by ability. Is that good or bad? I’m not sure. I still don’t feel like I have a great sense of what, if anything, went wrong, whether our parents’ rose-colored picture was accurate, or whether there’s anything short of reversing all progress towards egalitarianism that could take us back. I’m interested to get comments from people who understand this area better than I do.



from Hacker News http://bit.ly/YV9WJO
via IFTTT

Anti-Aging Discovery Could Lead to Restorative Skin Treatments

https://ift.tt/2UgDgRl


Despite a multi-billion-dollar skin care industry and plenty of marketing claims, nothing exists that can prevent our skin from turning into tissue paper as we age—except, perhaps, religiously wearing sunscreen. Accumulated damage from UV radiation and other age-related stressors drains the skin’s pool of renewal cells—or stem cells—and there is no way to stop or slow this process.

But hope for skincare junkies is on the horizon. A study published April 3 in Nature provides new insight into how stem cell loss occurs and even identifies two chemicals that may be able to prevent it.

The research, led by Emi Nishimura, a professor of stem cell biology at Tokyo Medical and Dental University in Japan, revealed that aging and UV exposure deplete stem cells of a crucial collagen protein. Skin aficionados may recognize collagen as a key player in maintaining strong, youthful, elastic skin. The weakened stem cells no longer divide normally, and are ultimately forced to turn into adult skin cells. Over time, so many stem cells become damaged that there aren’t enough healthy ones to replace them.

“I think it’s a beautiful study,” says David Fisher, a professor of dermatology at Harvard Medical School who was not involved in the research. “I think it’s a very elegant analysis, but also it has some very practical mechanistic insights into how this is happening, and even potentially actionable ones to promote youthfulness.”

Our skin is divided into two sections: the epidermis on top and the dermis underneath. The epidermis is what we conventionally think of as our skin and is made up of many layers of cells, while the dermis consists of connective tissue, hair follicles, blood vessels, and sweat glands.

As part of normal skin health, the top layer of the epidermis is constantly being sloughed off and replaced from a self-replenishing pool of stem cells that hangs out on the bottom (or basal) layer. These stem cells have roots that anchor them to a thin piece of tissue called the basement membrane that connects the epidermis and the dermis. The tether to the basement membrane is essential for maintaining a cell’s “stemness”—its ability to replicate and mature into another type of cell.

Most of the time, the stem cells in the epidermis divide horizontally, cloning themselves and adding to the renewal pool. Sometimes, though, they divide vertically, and the new cell starts to mature into an adult skin cell, which is gradually pushed up through the layers of the epidermis.

This type of cell turnover—replacing older cells at the top of the epidermis with younger cells from the bottom—explains how cuts heal and skin stays young looking. As people age, however, the pool of stem cells becomes depleted and cell turnover slows, eventually leaving people with thin, fragile skin.

“The ultimate question, which [the study is] trying to address, is why are there fewer cells? Why do we lose stem cells as we get older?” says Terry Lechler, an associate professor of dermatology at Duke University who was not involved in the research. “I think that's the real crux and the really interesting question.”

The study suggests that the stem cells that divide vertically do so because they are damaged through regular aging and the normal cell turnover process, as well as exposure to UV light or other types of toxins. And not only does the new adult cell start its journey through the epidermis, but the original stem cell also gets pushed off the basal layer, forcing it to mature. This is because the damaged stem cell’s roots have become weakened, so it can no longer sufficiently anchor to the basement membrane. The researchers describe this step as a kind of competition, with the neighboring healthy stem cells banding together and forcing the weak stem cell off the island.

“It appears that this is due to a quality-control mechanism whereby a skin stem cell that gets damaged is basically purged from the skin,” says James DeGregori, a professor of biochemistry at the University of Colorado Denver who wrote a commentary article to accompany the paper. “You could almost imagine all of these stem cells are kind of jostling for position, and if you're really gripping that basement membrane, you're going to do better.”

At first this competition is beneficial, ridding the skin of malfunctioning cells or even cancer-causing mutations. However, at a certain point too many stem cells become damaged and they begin to outnumber the healthy ones. When this happens, the skin can no longer effectively rejuvenate itself or respond to injury. “Stem cell competition between epidermal stem cells sustains skin youthfulness, but the decline of the competition ends up with skin aging,” Nishimura explains.

The linchpin in this process is collagen 17, a specific type of collagen protein that is critical for rooting the stem cell to the basement membrane. As stem cells become damaged, they lose precious amounts of collagen 17. The more protein they lose, the weaker their bond to the basement membrane, until eventually they are forced out by neighboring healthy cells.
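
For intuition only, here is a toy simulation of that dynamic. This is not the paper’s model — every parameter below is invented — but it shows how a pool of anchored stem cells can hold steady for decades and then collapse once too few healthy neighbors remain to fill the gaps.

```python
import random

# Toy model: each stem cell carries a collagen-17 level that erodes
# with routine turnover and occasional UV "hits". Cells below an
# anchoring threshold are ejected; healthy neighbors divide (paying a
# small cost) to fill the gap, until none is strong enough to divide.

random.seed(0)

POOL_SIZE = 100
BASE_WEAR = 0.007     # collagen-17 lost per cell per round, from turnover
HIT_CHANCE = 0.05     # chance per round of a larger UV/toxin hit
DIVISION_COST = 0.03  # collagen-17 a new clone starts short of its donor
THRESHOLD = 0.5       # below this, a cell can no longer anchor

cells = [1.0] * POOL_SIZE

for decade in range(9):  # ages 0 through 80, ten rounds per decade
    print(f"age {decade * 10:2d}: {len(cells):3d}/100 stem cells anchored")
    for _ in range(10):
        cells = [c - BASE_WEAR
                 - (random.uniform(0.1, 0.3) if random.random() < HIT_CHANCE else 0)
                 for c in cells]
        cells = [c for c in cells if c >= THRESHOLD]  # weakened cells ejected
        while len(cells) < POOL_SIZE and cells:       # neighbors fill the gaps
            donor = max(cells)
            if donor - DIVISION_COST < THRESHOLD:
                break  # pool too worn down to replenish itself
            cells.append(donor - DIVISION_COST)
```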

The good news is that there may be a way to increase or preserve levels of collagen 17 in stem cells, staving off this process of skin aging. Nishimura showed that two experimental chemicals, Y27632 and apocynin, applied topically can increase collagen 17 levels in cells and even promote wound healing.

This does not mean you should purchase the next skin care product you see that has “collagen” or “stem cells” on the label—there is no evidence that anything on the market affects this pathway. But it does suggest a scientifically backed rejuvenating cream could be on the horizon.



from Scientific American https://ift.tt/n8vNiX
via IFTTT

Marketing the myth of serotonin, the ‘happy chemical’ (2015)

https://ift.tt/2UTPkEh


If serotonin is the "happy chemical," then boosting our serotonin levels should keep depression at bay. After all, low serotonin brings on the blues, right?

But the truth is, depression is not a serotonin deficiency. The idea that depression is caused by low serotonin levels is based on flimsy evidence dating to the 1950s. Pharmaceutical companies promoted the low serotonin story to sell Prozac and related antidepressants. They marketed a myth.

Today, the serotonin fallacy is as ingrained as the notion that drinking orange juice wards off a cold. Many of us still believe we are raising our serotonin levels to lift depression using wildly popular drugs known as selective serotonin re-uptake inhibitors, or SSRIs. But psychiatrists now say it is unlikely these drugs treat depression by simply increasing serotonin. While antidepressants help many patients, researchers have only a hazy idea of how they work.


The consensus is that depression is a complex disorder with hundreds of potential underlying causes, said Dr. Roger McIntyre, head of the mood disorders psychopharmacology unit at the University Health Network in Toronto. "There's really no scientific case to say that people who have depression have a deficiency in body and brain serotonin levels."

The medical journal BMJ put the spotlight on the low serotonin doctrine in a recent editorial published in April and written by Dr. David Healy, a professor of psychiatry at Bangor University in Wales.

Blockbuster sales of antidepressants such as Prozac are based on the marketing of the serotonin myth, Healy wrote. He added that pharmaceutical companies misled the public into putting too much faith in SSRIs.

Scientists never confirmed whether SSRIs raise or lower serotonin levels. "They still don't know," Healy said.

Many of his peers suggest that Healy is not a respected figure in psychiatry, in part because of his stance that older tricyclic antidepressants are better than today's Prozac-type drugs. His colleagues maintain that SSRIs are safer if taken in overdose than older antidepressants, and caution that patients should not switch medications based on Healy's views.

Nevertheless, most psychiatrists agree that depression is not a matter of serotonin levels being up or down. The role of serotonin in depression is best described as a "dysregulation" of the serotonin system, McIntyre said.

The serotonin system regulates aspects of behaviour, thought processes and mood. But it also interacts with other brain systems that may be involved in depression.


Modern antidepressants block the re-absorption of serotonin in the brain. When researchers discovered that SSRIs helped depression in some patients, they concluded that low serotonin must be the cause of the disorder. But the assumption was no more valid than the notion that "having a headache means that your Tylenol levels are low," McIntyre said.

McIntyre described psychiatrists as being "guilty of exuberance" when they framed depression as a low serotonin problem.

But others, including Healy, point fingers at the pharmaceutical industry. Starting in the late 1980s, Prozac and other "miracle pills" promised to treat serotonin deficiency as easily as vitamin C cures scurvy.

Marketing materials for the new antidepressants presented the unproven low serotonin hypothesis as fact, according to a 2005 study in the journal PLOS One.

For example, on a branded website for an antidepressant called Celexa, the FAQ section claimed that "Celexa helps to restore the brain's chemical balance by increasing the supply of a chemical messenger in the brain called serotonin."

Doctors adopted the low serotonin story as "an easy shorthand" for explaining depression to their patients, Healy wrote. Patients were receptive to SSRIs because the idea of correcting a chemical imbalance reduced the stigma of mental illness.


Treating depression with Prozac was reportedly a cinch. "You can teach a chimpanzee to prescribe it," a psychopharmacologist was quoted as saying in a 1989 New York magazine cover story entitled Bye-Bye Blues: A New Wonder Drug for Depression.

But the hype did not prepare patients for side effects ranging from nausea and anxiety to suicidal thoughts – or the strong chance that SSRIs might not relieve their suffering.

Two out of three patients with depression either do not respond to SSRIs, or get only partial relief from SSRIs and other antidepressants, according to findings from STAR*D, the largest clinical trial of depression medications. Completed in 2006, the trial was funded by the U.S. National Institute of Mental Health.

When SSRIs work, improvements in depression may be due to direct action on the serotonin system, or downstream effects of serotonin on other systems.

For example, severe depression is a side effect of interferon, a drug prescribed to spur the immune system in patients with skin cancer and Hepatitis C. Studies have shown that preventive treatment with SSRIs greatly reduces the risk of depression in patients on interferon. Based on these findings, "clearly the serotonin drugs have indirect effects on the immune-inflammatory system," McIntyre said.

The fact remains that not all patients benefit from taking SSRIs. Thus, the logical conclusion is that the right treatment depends on the individual cause of the patient's depression, McIntyre said. Depression is an illness of the brain – "by far the most complicated organ in the human body," he pointed out.


Treating depression may never be as simple as prescribing insulin for diabetes or ibuprofen for fever. And despite what pharmaceutical marketing may say, no single pill can cure this ill.

**

Antidepressant marketing

The marketing of antidepressants is worth a second look, since history tends to repeat itself.

Right now, the top-selling drug in the United States is Abilify, a powerful antipsychotic. Sales took off in 2007 when the drug was approved as an add-on treatment for depression in patients who do not get adequate relief from taking an antidepressant alone. (Canada approved this use in 2013.) By March, 2014, annual sales of Abilify topped $6-billion.

U.S. advertisements for Abilify say the drug works "like a thermostat to restore balance." But the drug's product insert acknowledges that the mechanism of action "is unknown." As with antidepressants, researchers have no idea how Abilify works.


In fact, psychiatrists are just beginning to understand what causes depression and why medications work for some patients, and not others.

Some researchers hypothesize that depression is due to a dysfunctional glutamate system. Disruptions in this fast-acting signalling system may alter neuroplasticity, the brain's ability to change and adapt in response to stress and other events. Reduced plasticity may lead to depression.

A second theory, known as the metabolic hypothesis, suggests that insulin, like glutamate, facilitates brain plasticity. Depression may occur when brain regions do not make enough insulin or cannot respond to insulin properly.

Other researchers are testing the immune-inflammatory hypothesis – the idea that depression is caused by inflammation due to an overworked immune system.

Another line of thinking, known as the bio-energetic hypothesis, suggests that tiny powerhouses within brain cells may not be producing or disposing of energy and energy byproducts properly. This may lead to destruction of brain cells, and consequently, depression.

All are "very active" areas of research, McIntyre said, "and they are not mutually exclusive."



from Hacker News https://ift.tt/YV9WJO
via IFTTT