Sunday, April 26, 2015

Apple Won’t Always Rule. Just Look at IBM


APRIL 25, 2015



Apple can’t grow like this forever. No company can.

In a few short years, Apple has become the biggest company on the planet by market value — so big that it dwarfs every other one on the stock market. It dominates the Standard & Poor’s 500-stock index as no other company has in 30 years.

Apple’s market capitalization — the value of all of the shares of its stock — is more than $758 billion, greater than any other company’s. Yet the Wall Street consensus is that Apple is still having a growth spurt. In fact, if Apple’s watches, phones, laptops and other gadgets and services keep generating favorable publicity — and if its quarterly earnings report on Monday is as strong as the market expects it to be — there’s a reasonable chance that Apple’s value will keep swelling. Not far down the road, it might even reach the $1 trillion level that some hedge funds predict.

But even if Apple still has some room to run, there are some early warning signs. After all, the company has already crossed a significant threshold. In February, it grew to twice the size of the next biggest company in the S.&P. 500, a rare feat of financial dominance, and one that hasn’t happened since Ronald Reagan was president.

I checked the numbers with Howard Silverblatt, senior index analyst at S.&P. Dow Jones Indices. He found that the last market colossus to tower over its competitors by a two-to-one ratio was IBM, which did it in three successive years: 1983, 1984 and 1985. “That was when PCs were new,” he said, “and just about everyone thought IBM would rule the world.”

Now it’s Apple’s world. Apple is the most widely held stock in American mutual fund portfolios. IBM, the former undisputed heavyweight champion, isn’t even in the running anymore. It ranks 62nd, according to a Morningstar analysis performed at my request. IBM is still an important company, but it is struggling. Investors judge it to be worth less than one-quarter of Apple’s market value today. What happened to IBM — how it became this small, in comparison with Apple — is worth remembering.

Market Gains, Now and Then

Apple’s market value is now roughly twice the size of the next biggest company’s in the Standard & Poor’s 500-stock index.

A company had not been in that position since 1985. But while Apple is now bigger than IBM was then, as a proportion of the index IBM was more dominant.

I had forgotten how imposing IBM once was. By some measures, it was vastly more important than Apple is today. Measured by market cap, for example, IBM accounted for a staggering 6.4 percent of the S.&P. 500 in 1985, its peak year — making it 2.35 times the size of the second-biggest company of its day, Exxon. Now Microsoft is the second biggest and Exxon Mobil is third, both roughly half the size of Apple. Exxon Mobil is followed in market cap by Google and Johnson & Johnson. (In this 45th anniversary year of Earth Day, the staying power of Exxon, from its Standard Oil days to the present, is also worth remembering.)

Apple has an outsize influence today: After the market close on Friday, its share of the S.&P. 500 was 4.1 percent, a formidable percentage and a huge increase from Dec. 31, when it was 3.35 percent. But its weight in the market is nothing like IBM’s in the 1980s, when IBM finished seven calendar years with a market weight above 4 percent — a level that Apple has not yet matched, the data shows. At IBM’s 1985 peak, its share of the S.&P. 500 was more than one and a half times the size of Apple’s today.
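The dominance comparison can be checked directly from the two index weights the article cites; a quick Python sketch:

```python
# S&P 500 index weights quoted in the article
apple_weight_2015 = 4.1   # percent, after Friday's close
ibm_weight_1985 = 6.4     # percent, at IBM's 1985 peak

# IBM's peak weight relative to Apple's today
ratio = ibm_weight_1985 / apple_weight_2015
print(round(ratio, 2))    # 1.56 -- "more than one and a half times"
```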

IBM operated in a different league than Apple does now. A business machine company at its roots, IBM never aspired to pop-culture coolness, but its prestige was extraordinary. You can’t measure prestige easily with numbers, but consider that in 1987, two IBM scientists based in Zurich won the Nobel Prize in Physics for a breakthrough in superconductivity. It was the second consecutive year that IBM scientists won the prize; in 1986, two of them won it for inventing an instrument known as the scanning tunneling microscope. All of those scientists did deep, basic research of which IBM was justly proud. Apple’s research today is impressive, but it has generally been product-driven, not the kind of fundamental work that IBM did.

With hindsight, it’s clear that IBM’s Olympian status was part of its problem. In the 1980s, at the height of its powers, it continued to come up with scientific breakthroughs and ultrafast computers, but its focus on its own product lines and customer service flagged. IBM “naïvely” handed over crucial parts of the computer business to companies like Microsoft and Intel, while its own profit margins began to erode, D. Quinn Mills, a professor at the Harvard Business School, has written.

For the most part, investors minimized those problems, if they were even aware of them. In those days of hulking mainframes, IBM was the quintessential computer company and its hegemony in the stock market seemed unstoppable.

It’s no wonder that a young Steve Jobs, the co-founder of the upstart Apple Computer company, took direct aim at IBM in a speech in San Francisco in the fall of 1983, deriding IBM as arrogant and shortsighted and predicting that it would soon be humbled. At that meeting, he unveiled a remarkable ad that would run on television during the 1984 Super Bowl. Created by the director of “Blade Runner,” Ridley Scott, it showed a young hammer-wielding athlete running through a vast grim room populated by serfs. She hurled her hammer at a screen on which an Orwellian Big Brother was intoning propaganda and shattered it.

A man read the words on-screen: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984.’ ” Apple didn’t mention IBM, but its target was clear. And soon after the Super Bowl, when Jobs actually introduced the first Macintosh to a rapt audience, the little personal computer continued the assault on IBM. In a cute synthesized voice, it spoke these words, which also appeared on its diminutive screen: “Unaccustomed as I am to public speaking, I’d like to share with you a maxim I thought of the first time I met an IBM mainframe: NEVER TRUST A COMPUTER YOU CAN’T LIFT.”

IBM thrived for years afterward, but just as Jobs had predicted, it turned out to be vulnerable to disruptive change, as all big companies are. For decades now, IBM has engaged in a sometimes painful transition, and as it revealed in its quarterly earnings report last week, it is still hurting: Its revenues have declined and it has endured wrenching business shifts. My colleague Steve Lohr wrote last week that IBM has been getting out of slow-growing old businesses, like personal computers, disk drives, low-end server computers and chip manufacturing — but its new initiatives in fields like data analytics, cloud computing and mobile apps for corporate customers haven’t entirely succeeded yet.

In a turnabout, IBM’s mobile app strategy relies on a partnership with the current giant, its old nemesis Apple. IBM is bringing its prowess in supercomputers and artificial intelligence to a new initiative, Watson Health, that includes Apple. That alliance could help both companies grow — in Apple’s case, by ensuring that its products work more seamlessly in corporate environments where IBM is deeply entrenched.

Rapid growth, after all, isn’t a sure thing, especially when you’re already the biggest company in the world. IBM has proved that. Sooner or later, Apple investors will have to take that lesson to heart.

A version of this article appears in print on April 26, 2015, on page BU4 of the New York edition with the headline: Apple Won’t Always Rule. Just Look at IBM.

Saturday, April 25, 2015

Why the journey to IPv6 is still the road less traveled


By Stephen Lawson
IDG News Service | Apr 20, 2015 11:10 AM PT

The writing’s on the wall about the short supply of IPv4 addresses, and IPv6 has been around since the late 1990s. So why does the new protocol still carry just a fraction of the Internet’s traffic?

Though IPv6 is a finished technology that works, rolling it out may be either a simple process or a complicated and risky one, depending on what role you play on the Internet. And the rewards for doing so aren’t always obvious. For one thing, making your site or service available via IPv6 helps only the relatively small number of users who are already set up with the protocol, creating a nagging chicken-and-egg problem.

The new protocol, which is expected to provide more addresses than users will ever need, has made deep inroads at some big Internet companies and service providers, especially mobile operators. Yet it still drives less than 10 percent of the world’s traffic. This is despite evidence that migrating to IPv6 can simplify networks and even speed up the Web experience.
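The claim that IPv6 can provide more addresses than users will ever need comes down to the width of the address field: 32 bits for IPv4 versus 128 for IPv6. A quick back-of-the-envelope check:

```python
# Address-space sizes implied by the two protocols' address widths
ipv4_addresses = 2 ** 32     # about 4.3 billion, nearly exhausted
ipv6_addresses = 2 ** 128

print(ipv4_addresses)                    # 4294967296
# Each IPv4 address corresponds to 2**96 IPv6 addresses -- an entire
# IPv4 internet's worth of space, squared, for every current address.
print(ipv6_addresses // ipv4_addresses == 2 ** 96)   # True
```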

The good news is that for ordinary enterprises, it can be just a matter of asking your ISP (Internet service provider) or hosting company for IPv6 service. Many of the major ISPs and CDNs (content delivery networks) are equipped to provide both IPv4 and IPv6 connections to a customer’s website, allowing partners and potential customers to reach it over the new technology if they have it.

For example, AT&T offers large enterprises native dual-stack connections to its network, allowing users to reach those companies over either IPv4 or IPv6, said Brooks Fitzsimmons, assistant vice president of technology and operations. That’s for customers with high-speed connections such as Gigabit Ethernet. For those with slower links, including consumers, AT&T can encapsulate IPv6 traffic for transport over existing IPv4 connections. Many other big service providers also offer such services.
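A dual-stack site is simply one that publishes both IPv4 and IPv6 addresses, so clients connect over whichever protocol they have. A minimal Python sketch of the idea (the function name and sample addresses are illustrative, not from the article):

```python
import ipaddress

def dual_stack_support(addresses):
    """Report whether a host's published addresses cover IPv4, IPv6, or both.

    A dual-stack host advertises an IPv4 (A) record and an IPv6 (AAAA)
    record, so clients can reach it over either protocol.
    """
    versions = {ipaddress.ip_address(addr).version for addr in addresses}
    return {"ipv4": 4 in versions, "ipv6": 6 in versions}

# Sample addresses drawn from the reserved documentation ranges
print(dual_stack_support(["198.51.100.7", "2001:db8::7"]))
# {'ipv4': True, 'ipv6': True}
```

In practice you would gather a real host’s addresses with `socket.getaddrinfo(hostname, None)`, which returns both A and AAAA results when the host publishes them.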

Yet only about 14 percent of the top 1,000 websites have turned on IPv6.

“Many of them haven’t enabled it yet because of this perceived notion that it’s difficult. It’s not. It just takes a little bit of time,” said Paul Saab, a software engineer at Facebook, which is fully available over IPv6. “It can be done with a very small team,” Saab said.

That’s not true for everyone. Saab acknowledged that ISPs have a harder time of it, and some big enterprises that run their own Internet presence may also face challenges. For them, implementing IPv6 can be a complicated effort involving changes to automated processes and other back-end components, judging from war stories told at last week’s meeting of ARIN (the American Registry for Internet Numbers). Meanwhile, such projects can slip down the priority list in the face of day-to-day operations and dealing with growth.

“It’s a monstrous undertaking by any operator,” said analyst Michael Howard of Infonetics Research, a division of IHS. “The protocols might be robust, but this operation is delicate, because it has to be coordinated with all the routers.”

An error along the way can do more than slow down service—it can cut off users’ access to a website completely. There are software tools to automate the process, but they can’t handle every single router, Howard said, as routing code is full of quirky exceptions.

That painstaking effort is why some service providers are waiting to roll out the protocol, he said. In cases where it requires new hardware, they may also be waiting for their current gear to depreciate rather than rip it out early and take a loss. This can take years.

There’s a lot of work involved in any project that spans an entire network, AT&T’s Fitzsimmons said. In AT&T’s case, encapsulating IPv6 in IPv4 allowed the carrier to upgrade most home users’ equipment through software updates, but other changes require new hardware.

There are ways around deploying IPv6, even for service providers that don’t have enough IPv4 addresses for all their subscribers, but those techniques have limitations. The biggest is carrier-grade NAT (network address translation), a large-scale form of the way home and office networks let users share external addresses.

Instead of connecting directly to a site, subscribers go through a proxy that doles out IPv4 addresses from a limited stockpile. That makes a network more complex and can slow things down. It can also keep an Internet-based service from getting all the information it could, according to John Curran, ARIN’s president and CEO. For example, if the proxy that’s providing a user’s address is located in a different city from that user, then location data that could aid in targeting ads would be unusable, he said.
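The proxy arrangement described above amounts to a big translation table: many subscribers share one public IPv4 address and are told apart only by translated source port. A toy Python sketch of the concept (the class and method names are invented for illustration, not a real NAT implementation):

```python
import itertools

class CarrierGradeNAT:
    """Toy model of carrier-grade NAT: subscribers share one public
    IPv4 address and are distinguished only by translated source port."""

    def __init__(self, public_ip, first_port=1024):
        self.public_ip = public_ip
        self._ports = itertools.count(first_port)
        self._table = {}   # (subscriber_ip, port) -> public source port

    def translate(self, subscriber_ip, subscriber_port):
        key = (subscriber_ip, subscriber_port)
        if key not in self._table:
            self._table[key] = next(self._ports)   # allocate a new port
        return (self.public_ip, self._table[key])

nat = CarrierGradeNAT("203.0.113.1")      # documentation-range address
print(nat.translate("10.0.0.5", 40000))   # ('203.0.113.1', 1024)
print(nat.translate("10.0.0.6", 40000))   # ('203.0.113.1', 1025)
```

Both subscribers surface as the same public address, so a website sees them as one client, and any geolocation tied to 203.0.113.1 may describe neither of them accurately, which is exactly the ad-targeting problem Curran describes.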

One barrier to getting an IPv6 deployment going is that it doesn’t generate any new revenue—at least for now. But for service providers, at least, the growth of mobile data and the Internet of Things is liable to change that.

Eventually, it’s expected that enterprises and governments deploying millions of sensors, cameras and connected machines will demand so many unique Internet addresses that service providers will only be able to satisfy that demand with IPv6. Holding off will cost them additional business. Howard at Infonetics thinks that day may be just three years away.

AT&T has only recently started to turn on IPv6 on its wireless network, but it believes this is where the technology ultimately will matter most. The carrier reported 19.8 million connected devices in use at the end of last year and expects that to grow quickly with the Internet of Things.

“You can’t do that in perpetuity, successfully, without IPv6,” Fitzsimmons said.

Russian Hackers Read Obama’s Emails

Russian Hackers Read Obama’s Unclassified Emails, Officials Say


WASHINGTON — Some of President Obama’s email correspondence was swept up by Russian hackers last year in a breach of the White House’s unclassified computer system that was far more intrusive and worrisome than has been publicly acknowledged, according to senior American officials briefed on the investigation.

The hackers, who also got deeply into the State Department’s unclassified system, do not appear to have penetrated closely guarded servers that control the message traffic from Mr. Obama’s BlackBerry, which he or an aide carries constantly.

But they obtained access to the email archives of people inside the White House, and perhaps some outside, with whom Mr. Obama regularly communicated. From those accounts, they reached emails that the president had sent and received, according to officials briefed on the investigation.

White House officials said that no classified networks had been compromised, and that the hackers had collected no classified information. Many senior officials have two computers in their offices, one operating on a highly secure classified network and another connected to the outside world for unclassified communications.

But officials have conceded that the unclassified system routinely contains much information that is considered highly sensitive: schedules, email exchanges with ambassadors and diplomats, discussions of pending personnel moves and legislation, and, inevitably, some debate about policy.

Officials did not disclose the number of Mr. Obama’s emails that were harvested by hackers, nor the sensitivity of their content. The president’s email account itself does not appear to have been hacked. Aides say that most of Mr. Obama’s classified briefings — such as the morning Presidential Daily Brief — are delivered orally or on paper (sometimes supplemented by an iPad system connected to classified networks) and that they are usually confined to the Oval Office or the Situation Room.

Still, the fact that Mr. Obama’s communications were among those hit by the hackers — who are presumed to be linked to the Russian government, if not working for it — has been one of the most closely held findings of the inquiry. Senior White House officials have known for months about the depth of the intrusion.

“This has been one of the most sophisticated actors we’ve seen,” said one senior American official briefed on the investigation.

Others confirmed that the White House intrusion was viewed as so serious that officials met on a nearly daily basis for several weeks after it was discovered. “It’s the Russian angle to this that’s particularly worrisome,” another senior official said.

While Chinese hacking groups are known for sweeping up vast amounts of commercial and design information, the best Russian hackers tend to hide their tracks better and focus on specific, often political targets. And the hacking happened at a moment of renewed tension with Russia — over its annexation of Crimea, the presence of its forces in Ukraine and its renewed military patrols in Europe, reminiscent of the Cold War.

Inside the White House, the intrusion has raised a new debate about whether it is possible to protect a president’s electronic presence, especially when it reaches out from behind the presumably secure firewalls of the executive branch.

Mr. Obama is no stranger to computer-network attacks: His 2008 campaign was hit by Chinese hackers. Nonetheless, he has long been a frequent user of email, and publicly fought the Secret Service in 2009 to retain his BlackBerry, a topic he has joked about in public. He was issued a special smartphone, and the list of those he can exchange emails with is highly restricted.

When asked about the investigation’s findings, the spokeswoman for the National Security Council, Bernadette Meehan, said, “We’ll decline to comment.” The White House has also declined to provide any explanations about how the breach was handled, though the State Department has been more candid about what kind of systems were hit and what it has done since to improve security. A spokesman for the F.B.I. declined to comment.

Officials who discussed the investigation spoke on the condition of anonymity because of the delicate nature of the hacking. While the White House has refused to identify the nationality of the hackers, others familiar with the investigation said that in both the White House and State Department cases, all signs pointed to Russians.

On Thursday, Secretary of Defense Ashton B. Carter revealed for the first time that Russian hackers had attacked the Pentagon’s unclassified systems, but said they had been identified and “kicked off.” Defense Department officials declined to say if the signatures of the attacks on the Pentagon appeared related to the White House and State Department attacks.

The discovery of the hacking in October led to a partial shutdown of the White House email system. The hackers appear to have been evicted from the White House systems by the end of October. But they continued to plague the State Department, whose system is much more far-flung. The disruptions were so severe that during the Iranian nuclear negotiations in Vienna in November, officials needed to distribute personal email accounts to one another and to some reporters to maintain contact.

Earlier this month, officials at the White House said that the hacking had not damaged its systems and that, while elements had been shut down to mitigate the effects of the attack, everything had been restored.

One of the curiosities of the White House and State Department attacks is that the administration, which recently has been looking to name and punish state and nonstate hackers in an effort to deter attacks, has refused to reveal its conclusions about who was responsible for this complex and artful intrusion into the government. That is in sharp contrast to Mr. Obama’s decision, after considerable internal debate in December, to name North Korea for ordering the attack on Sony Pictures Entertainment, and to the director of national intelligence’s decision to name Iranian hackers as the source of a destructive attack on the Sands Casino.

This month, after CNN reported that hackers had gained access to sensitive areas of the White House computer network, including sections that contained the president’s schedule, the White House spokesman, Josh Earnest, said the administration had not publicly named who was behind the hack because federal investigators had concluded that “it’s not in our best interests.”

By contrast, in the North Korea case, he said, investigators concluded that “we’re more likely to be successful in terms of holding them accountable by naming them publicly.”

But the breach of the president’s emails appeared to be a major factor in the government secrecy. “All of this is very tightly held,” one senior American official said, adding that the content of what had been breached was being kept secret to avoid tipping off the Russians about what had been learned from the investigation.

Mr. Obama’s friends and associates say that he is a committed user of his BlackBerry, but that he is careful when emailing outside the White House system.

“The frequency has dropped off in the last six months or so,” one of his close associates said, though this person added that he did not know if the drop was related to the hacking.

Mr. Obama is known to send emails to aides late at night from his residence, providing them with his feedback on speeches or, at times, entirely new drafts. Others say he has emailed on topics as diverse as his golf game and the struggle with Congress over the Iranian nuclear negotiations.

George W. Bush gave up emailing for the course of his presidency and did not carry a smartphone. But after Mr. Bush left office, his sister’s email account was hacked, and several photos — including some of his paintings — were made public.

The White House is bombarded with cyberattacks daily, not only from Russia and China. Most are easily deflected.

The White House, the State Department, the Pentagon and intelligence agencies put their most classified material into a system called Jwics, for Joint Worldwide Intelligence Communications System. That is where top-secret and “sensitive compartmented information” traverses within the government, to officials cleared for it — and it includes imagery, data and graphics. There is no evidence, senior officials said, that this hacking pierced it.

A version of this article appears in print on April 26, 2015, on page A1 of the New York edition with the headline: Russian Hackers Read Obama’s Unclassified Emails, Officials Say.

Tiny robots climb walls carrying more than 100 times their weight

18:30 24 April 2015 by Aviva Rutkin
Mighty things come in small packages. The little robots in this video can haul things that weigh over 100 times more than themselves.

The super-strong bots – built by mechanical engineers at Stanford University in California – will be presented next month at the International Conference on Robotics and Automation in Seattle, Washington.

The secret is in the adhesives on the robots' feet. Their design is inspired by geckos, which have climbing skills that are legendary in the animal kingdom. The adhesives are covered in minute rubber spikes that grip firmly onto the wall as the robot climbs. When pressure is applied, the spikes bend, increasing their surface area and thus their stickiness. When the robot picks its foot back up, the spikes straighten out again and detach easily.

The bots also move in a style that is borrowed from biology. Like an inchworm, the robot scooches forward on one pad while the other stays in place to support the heavy load. This helps the robot avoid falling when it misses a step, and lets it park without using up precious power.

Heavy lifting

All this adds up to robots with serious power. For example, one 9-gram bot can hoist more than a kilogram as it climbs. In this video it's carrying StickyBot, the Stanford lab's first ever robot gecko, built in 2006.

Another tiny climbing bot weighs just 20 milligrams but can carry 500 milligrams, a load about the size of a small paper clip. Engineer Elliot Hawkes built the bot under a microscope, using tweezers to put the parts together.

The most impressive feat of strength comes from a ground bot nicknamed μTug. Although it weighs just 12 grams, it can drag a weight that’s 2000 times heavier – “the same as you pulling around a blue whale,” explains David Christensen, who works in the same lab.
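The blue-whale comparison follows from scaling μTug’s 2000-to-1 drag ratio up to human size; a quick check (the 70 kg body mass is an assumed figure, not from the article):

```python
# mu-Tug's figures as reported
robot_mass_g = 12
drag_ratio = 2000
print(robot_mass_g * drag_ratio / 1000)   # 24.0 kg dragged by a 12 g robot

# The same ratio scaled to a person (assumed 70 kg)
human_mass_kg = 70
print(human_mass_kg * drag_ratio / 1000)  # 140.0 tonnes -- blue-whale scale
```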

In future, the team thinks that machines like these could be useful for hauling heavy things in factories or on construction sites. They could also be useful in emergencies: for example, one might carry a rope ladder up to a person trapped on a high floor in a burning building.

But for tasks like these, the engineers may have to start attaching their adhesives to robots that are even larger – and thus more powerful. "If you leave yourself a little more room, you can do some pretty amazing things," says Christensen.

Italian eyewear company working on GOOGLE Glass 2.0...

Luxottica CEO says company is working on Google Glass 2.0

by Billy Steele | @wmsteele

When Nest CEO Tony Fadell took over Google Glass back in February, he pledged to redesign the headset "from scratch." Well, it looks like that process is well underway. In a company meeting today, Luxottica CEO Massimo Vian said the Italian eyewear company is working with the folks in Mountain View on not one, but two new versions of the device. Luxottica owns brands Ray-Ban and Oakley, and if you'll recall, the company worked with Google on frames for the original version of Glass.

"What you saw was version 1," Vian said. "We're now working on version 2, which is in preparation."

Vian also explained that a third version is in the works, and there are currently "second thoughts" on what it'll look like. Aside from the promised redesign, details are scarce on the new model(s), besides reports that it'll be powered by Intel and aim to be what Google's Eric Schmidt calls "ready for users." Speaking of Intel, Luxottica has its own product in the works with the chip maker that's set to debut in 2016.

Thursday, April 23, 2015

Electronic waste worth £34bn piling up in 'toxic mine', warns UN report

Very little of the discarded electrical equipment, which includes gold and silver, is being recycled - and Britain is among the worst offenders

By Cahal Milmo Sunday 19 April 2015

Gold worth more than £7bn is being thrown away amid the 42 million tons of electronic and electrical equipment discarded by consumers each year, according to United Nations experts.

A report by the United Nations University (UNU) reveals that the amount of “e-waste” generated globally is increasing by two million tons a year and will reach 50 megatons by 2018 – with Britons among the planet’s biggest generators of hi-tech junk.

The study warns that less than 16 per cent of global e-waste is being diverted from landfill into recycling and reuse – representing the loss of an “urban mine” of potentially recyclable materials worth more than £34bn.

Among the resources being lost annually, as millions of items from mobile phones to fridges are inadequately disposed of, are 300 tons of gold (equivalent to more than a 10th of global production in 2013) as well as 1,000 tons of silver worth £400m and 16 megatons of steel with a value of £6.5bn.

The fast-growing mountain of waste also contains alarming quantities of toxins, including 4,400 tons of ozone-depleting chemicals and 2.2 megatons of lead glass, more than the weight of the Empire State Building.

Heavy metals and other chemicals commonly found in electronics such as mercury, cadmium and beryllium can leach into the ground and water supplies, causing kidney and liver damage and impaired mental development.

David Malone, the UN under-secretary and rector of the Tokyo-based UNU, said: “Worldwide, e-waste constitutes a valuable ‘urban mine’ – a large potential reservoir of recyclable materials. At the same time, the hazardous content of e-waste constitutes a ‘toxic mine’ that must be managed with extreme care. There is a large portion of e-waste that is not being collected and treated in an environmentally sound manner.”

The researchers said that the unquenchable appetite for electronics and appliances both in developed and developing countries was generating enough waste to fill 1.2 million 40-ton lorries each year. A queue of such lorries would stretch from New York to Tokyo and back again.

Rising sales of technology and the shortening life cycle of that equipment are the key drivers of this avalanche of e-waste, with consumers unwilling to hold on to a product when a newer model or innovation comes along, even though the item may still be functional.

A separate study in Britain by Wrap, the government-backed charity which encourages recycling, found that 23 per cent of electric and electronic waste collected from municipal sites was still in good working order or required only a small amount of repair.

The UNU research found that rather than being dominated by discarded electronics such as mobile phones or computers, the majority (nearly 60 per cent) of e-waste consisted of large and small domestic appliances or office equipment. It included 12.8 megatons of smaller items such as microwaves or toasters and 18.8 megatons of “white goods” such as fridges, washing machines, dryers and other large appliances.

Britain is identified as among the world’s most profligate producers of e-waste, ranking fifth in the weight of material discarded per inhabitant, with each Briton generating 23.5kg each year. The UK was also sixth worldwide in the total amount of e-waste the country generated, with some 1.5 megatons – barely 100,000 tons less than India which has 20 times the population.

Experts said Britain, in common with other European Union countries, was missing an opportunity to ensure that its inhabitants’ appetite for consumer durables results in a thriving recycling industry. The UNU report said that only one-third of e-waste in the UK is recycled through recognised schemes – a figure that must reach 85 per cent under EU rules by 2019. Federico Magalini, a UNU researcher, said: “In the UK we are seeing that the ‘lifespan’ of an electric or electronic product may be particularly short.

“We should not simply try to stop consumption to minimise the amount of waste being generated, but should instead make sure that it is properly collected and recycled. There is an opportunity to create jobs and extract those resources currently being discarded.”

In the meantime, the authorities face a mammoth task in trying to shut down illegal waste exports, including e-waste which is sent to the developing world where components are melted down in often primitive conditions.

The Environment Agency estimates that some 11,500 shipping containers are illegally exported from the UK each year containing either household or electrical waste – the equivalent of 200,000 tons of material a year.

An Essex man last year became the first person in Britain to be jailed for the illegal export of e-waste after he was sentenced to 16 months’ imprisonment for smuggling 46 tons of material to Africa.

Joe Benson, 54, was found to have packed containers with waste including broken cathode ray tubes and ozone-depleting refrigerators bound for Nigeria, Ivory Coast and Ghana. He stood to make £8,000 per container.

Wednesday, April 22, 2015

Chinese scientists genetically modify human embryos


Rumours of germline modification prove true — and look set to reignite an ethical debate.

By David Cyranoski & Sara Reardon

22 April 2015

Human embryos are at the centre of a debate over the ethics of gene editing.

In a world first, Chinese scientists have reported editing the genomes of human embryos. The results are published in the online journal Protein & Cell and confirm widespread rumours that such experiments had been conducted—rumours that sparked a high-profile debate last month about the ethical implications of such work.

In the paper, researchers led by Junjiu Huang, a gene-function researcher at Sun Yat-sen University in Guangzhou, tried to head off such concerns by using ‘non-viable’ embryos, obtained from local fertility clinics, which cannot result in a live birth. The team attempted to modify the gene responsible for β-thalassaemia, a potentially fatal blood disorder, using a gene-editing technique known as CRISPR/Cas9. The researchers say that their results reveal serious obstacles to using the method in medical applications.

"I believe this is the first report of CRISPR/Cas9 applied to human pre-implantation embryos and as such the study is a landmark, as well as a cautionary tale," says George Daley, a stem-cell biologist at Harvard Medical School in Boston. "Their study should be a stern warning to any practitioner who thinks the technology is ready for testing to eradicate disease genes."

Some say that gene editing in embryos could have a bright future because it could eradicate devastating genetic diseases before a baby is born. Others say that such work crosses an ethical line: researchers warned in Nature [2] in March that because the genetic changes to embryos, known as germline modification, are heritable, they could have an unpredictable effect on future generations. Researchers have also expressed concerns that any gene-editing research on human embryos could be a slippery slope towards unsafe or unethical uses of the technique.

The paper by Huang's team looks set to reignite the debate on human-embryo editing — and there are reports that other groups in China are also experimenting on human embryos.

Problematic gene

The technique used by Huang’s team involves injecting embryos with the enzyme complex CRISPR/Cas9, which binds and splices DNA at specific locations. The complex can be programmed to target a problematic gene, which is then replaced or repaired by another molecule introduced at the same time. The system is well studied in human adult cells and in animal embryos. But there had been no published reports of its use in human embryos.

Huang and his colleagues set out to see if the procedure could replace a gene in a single-cell fertilized human embryo; in principle, all cells produced as the embryo developed would then have the repaired gene. The embryos they obtained from the fertility clinics had been created for use in in vitro fertilization but had an extra set of chromosomes, following fertilization by two sperm. This prevents the embryos from resulting in a live birth, though they do undergo the first stages of development.

Huang’s group studied the ability of the CRISPR/Cas9 system to edit the gene called HBB, which encodes the human β-globin protein. Mutations in the gene are responsible for β-thalassaemia.

Serious obstacles

The team injected 86 embryos and then waited 48 hours, enough time for the CRISPR/Cas9 system and the molecules that replace the missing DNA to act — and for the embryos to grow to about eight cells each. Of the 71 embryos that survived, 54 were genetically tested. This revealed that just 28 were successfully spliced, and that only a fraction of those contained the replacement genetic material. “If you want to do it in normal embryos, you need to be close to 100%,” Huang says. “That’s why we stopped. We still think it’s too immature.”
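As a rough sanity check, the reported counts can be turned into approximate success rates. This is a back-of-envelope sketch using only the figures quoted above; the variable names are my own:

```python
# Counts reported in the study: embryos injected, surviving after
# 48 hours, genetically tested, and successfully spliced.
injected, survived, tested, spliced = 86, 71, 54, 28

survival_rate = survived / injected  # fraction of injected embryos that survived
splice_rate = spliced / tested       # fraction of tested embryos that were spliced

print(f"survival: {survival_rate:.1%}")  # roughly 83%
print(f"splicing: {splice_rate:.1%}")    # roughly 52%
```

Even that roughly 52% splicing rate overstates success, since only a fraction of those embryos carried the replacement genetic material; this squares with Huang’s remark that work on normal embryos would need to be close to 100%.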

His team also found a surprising number of ‘off-target’ mutations assumed to be introduced by the CRISPR/Cas9 complex acting on other parts of the genome. This effect is one of the main safety concerns surrounding germline gene editing because these unintended mutations could be harmful. The rates of such mutations were much higher than those observed in gene-editing studies of mouse embryos or human adult cells. And Huang notes that his team probably detected only a subset of the unintended mutations because their study looked only at a portion of the genome, known as the exome. “If we did the whole genome sequence, we would get many more,” he says.

Ethical questions

Huang says that the paper was rejected by Nature and Science, in part because of ethical objections; both journals declined to comment on the claim. (Nature’s news team is editorially independent of its research editorial team.)

He adds that critics of the paper have noted that the low efficiencies and high number of off-target mutations could be specific to the abnormal embryos used in the study. Huang acknowledges the critique, but says that, because there are no examples of gene editing in normal embryos, there is no way to know whether the technique operates differently in them.

Still, he maintains that the embryos allow for a more meaningful model — and one closer to a normal human embryo — than an animal model or one using adult human cells. “We wanted to show our data to the world so people know what really happened with this model, rather than just talking about what would happen without data,” he says.

But Edward Lanphier, one of the scientists who sounded the warning in Nature last month, says: "It underlines what we said before: we need to pause this research and make sure we have a broad based discussion about which direction we’re going here." Lanphier is president of Sangamo Biosciences in Richmond, California, which applies gene-editing techniques to adult human cells.

Huang now plans to work out how to decrease the number of off-target mutations using adult human cells or animal models. He is considering different strategies — tweaking the enzymes to guide them more precisely to the desired spot, introducing the enzymes in a different format that could help to regulate their lifespans and thus allow them to be shut down before mutations accumulate, or varying the concentrations of the introduced enzymes and repair molecules. He says that using other gene-editing techniques might also help. CRISPR/Cas9 is relatively efficient and easy to use, but another system called TALEN is known to cause fewer unintended mutations.

The debate over human embryo editing is sure to continue for some time, however. CRISPR/Cas9 is known for its ease of use and Lanphier fears that more scientists will now start to work towards improving on Huang's paper. “The ubiquitous access to and simplicity of creating CRISPRs," he says, "creates opportunities for scientists in any part of the world to do any kind of experiments they want.”

A Chinese source familiar with developments in the field said that at least four groups in China are pursuing gene editing in human embryos.

Nature doi:10.1038/nature.2015.17378


1. Liang, P. et al. Protein Cell (2015).

2. Lanphier, E. et al. Nature 519, 410–411 (2015).

3. Baltimore, D. et al. Science 348, 36–38 (2015).