Archive for the ‘(de)extinction files’ Category

How Impossible, Actually, Is the Dinosaur DNA Splicing in Jurassic World?

Well, there’s just one problem: Dinos are not like strawberries. In the case of GMO crops, we’re talking about isolating one gene that codes for one specific trait. In the case of Jurassic World, we’re talking about traits that involve hundreds of genes. Take camouflage, the trait that (spoiler alert!) so surprises the Indominus rex’s trainers. Blending in with your surroundings requires tweaks to neural genes, skin genes, hormonal genes, temperature sensitivity genes. “It’s likely a whole suite of genes,” says Beth Shapiro, a professor of ecology and evolutionary biology at the University of California at Santa Cruz and author of How to Clone a Mammoth: The Science of De-Extinction.

In other words, it’s not a simple matter of genetic cut-and-pasting. “When genomes evolve, they don’t do so in isolation,” says Shapiro. “They do so in the background of the entire genome.” Many of the genes you’re messing with are pleiotropic—that is, they code for several different characteristics. And it’s not like all of them are located in one place; they’re distributed all over the genome. You start to appreciate the difficulty. Shapiro compares the challenge to trying to swap out an elephant’s forelegs for wings. “I can’t cut out a wing gene, insert into an elephant and assume I’m going to get an elephant with wings,” she told me, not without a touch of exasperation. “There is no wing gene.”

There’s a bigger reason this wouldn’t work. Though we’ve sequenced hundreds of animal genomes, we still don’t know exactly how each one functions as a whole. You might say we have the vocabulary to describe the language of biology, but we haven’t yet mastered the grammar. As DeSalle puts it: “We’ve had the chicken genome sequences for a decade now—and we still don’t know chickenshit about it.”

Renowned paleontologist Jack Horner has spent his career trying to reconstruct a dinosaur. He’s found fossils with extraordinarily well-preserved blood vessels and soft tissues, but never intact DNA. So, in a new approach, he’s taking living descendants of the dinosaur (chickens) and genetically engineering them to reactivate ancestral traits — including teeth, tails, and even hands — to make a “Chickenosaurus”.

Dino-Chicken: Wacky But Serious Science Idea of 2011

LiveScience: So if you could bring a dinosaur back — the real thing, not a modified chicken — what species would you choose?

Horner: A little one. A little plant-eater.

LiveScience: No T. rex for you?

Horner: Would you make something that would turn around and eat you? Sixth-graders would do that, but I’d just as soon make something that wouldn’t eat me. And you could have it as a pet without worrying about it eating the rest of your pets.

If Science Could ‘Clone A Mammoth,’ Could It Save An Elephant?

Until we figure out how to meet the physical and psychological needs of elephants in captivity, they shouldn’t be in captivity at all, much less being used to make mammoths. If we were to put that all aside, I don’t want to see mammoths come back — it’s never going to be possible to create a species that is 100 percent identical. But what if we could use this technology not to bring back mammoths but to save elephants?

What if we could use this technology to make elephants slightly better adapted to cooler climates, the type of place that mammoths used to live? We could then create more space for them. … Mammoths and elephants have approximately 99 percent identical genomes. If we are talking about changing a few genes here and there to make them better adapted to living in the cold, I think we are talking about preserving elephants.

I think that the key use of this technology … is to protect species and populations that are alive today. Take, for example, the black-footed ferrets that are living across the plains of North America. Black-footed ferrets nearly went extinct a couple decades ago because of extermination programs. Today, black-footed ferrets are threatened by a disease. What if we could use this same technology that we’re talking about to go back in time, to sequence DNA of ferrets in museums somewhere that are decades or centuries or even thousands of years old, and find genetic diversity in those that we could then inject in the populations today that have no genetic diversity?

Maybe we could use this technology to give those populations a little bit of a genetic booster shot and maybe a fighting chance against the diseases that are killing them. We’re facing a crisis — a conservation, biodiversity crisis. This technology might be a very powerful new weapon in our arsenal against what’s going on today. I don’t think we should dismiss it out of fear.


Medical science is rapidly progressing. Currently, DNA is used to identify predispositions to diseases including cancer, diabetes, heart disease, autism and many others. It has also been discovered that DNA changes in response to daily exposure to pollution, chemicals, smoke, the sun and even stress. Soon, DNA taken from earlier in life may be used therapeutically to treat or prevent disease later in life.

With the Family Vault, stored DNA samples can be valuable for identifying diseases that are passed along genetically. The lack of an ancestor’s DNA sample can make identifying a condition difficult or impossible, and some genetically influenced conditions skip generations. Be prepared for future generations with a complete family history.

Consider getting one Family DNA Vault per person so you can capture DNA over each member’s lifetime. For example, save a tiny drop of blood at birth, one at 12 years old, another at 25 and 50. When they have a family, they can add others to it.

 

A coalition of geneticists and computer programmers calling itself the Global Alliance for Genomics and Health is developing protocols for exchanging DNA information across the Internet. The researchers hope their work could be as important to medical science as HTTP, the protocol created by Tim Berners-Lee in 1989, was to the Web.

One of the group’s first demonstration projects is a simple search engine that combs through the DNA letters of thousands of human genomes stored at nine locations, including Google’s server farms and the University of Leicester, in the U.K. According to the group, which includes key players in the Human Genome Project, the search engine is the start of a kind of Internet of DNA that may eventually link millions of genomes together.

The technologies being developed are application program interfaces, or APIs, that let different gene databases communicate. Pooling information could speed discoveries about what genes do and help doctors diagnose rare birth defects by matching children with suspected gene mutations to others who are known to have them.
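The API idea described above can be made concrete with a toy sketch (all names, coordinates, and data here are invented, and real GA4GH interfaces are far richer): each site answers only a bare presence query, "does any genome here carry allele X at position Y?", so databases can be searched jointly without exporting whole genomes.

```python
# Hypothetical sketch of a presence-query ("beacon"-style) interface.
# Each site keeps its genomes private and exposes only yes/no answers.

def make_beacon(genomes):
    """Index a site's private genomes as a set of (chrom, pos, allele)."""
    index = set()
    for variants in genomes.values():
        index.update(variants)
    def query(chrom, pos, allele):
        return (chrom, pos, allele) in index
    return query

# Two toy "sites", each holding its own samples.
site_a = make_beacon({
    "sample1": {("13", 32315474, "G"), ("17", 43044295, "A")},
})
site_b = make_beacon({
    "sample2": {("13", 32315474, "T")},
})

# A federated search fans the same question out to every site.
def federated_query(sites, chrom, pos, allele):
    return {name: q(chrom, pos, allele) for name, q in sites.items()}

hits = federated_query({"a": site_a, "b": site_b}, "13", 32315474, "G")
print(hits)  # {'a': True, 'b': False}
```

A real deployment would put each query function behind an HTTP endpoint; the point is that only booleans cross site boundaries.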

The researchers felt they had to act because the falling cost of decoding a genome—then about $10,000, and now already closer to $2,000—was producing a flood of data they were not prepared for. They feared ending up like U.S. hospitals, with electronic systems that are mostly balkanized and unable to communicate.

The way genomic data is siloed is becoming a problem because geneticists need access to ever larger populations. They use DNA information from as many as 100,000 volunteers to search for genes related to schizophrenia, diabetes, and other common diseases. Yet even these quantities of data are no longer seen as large enough to drive discovery. “You are going to need millions of genomes,” says David Altshuler, deputy director of the Broad Institute in Cambridge and chairman of the new organization. And no single database is that big.

The Global Alliance thinks the answer is a network that would open the various databases to limited digital searches by other scientists. Using that concept, says Heidi Rehm, a Harvard Medical School geneticist, the alliance is already working on linking together some of the world’s largest databases of information about the breast cancer genes BRCA1 and BRCA2, as well as nine currently isolated databases containing data about genes that cause rare childhood diseases.

 

 


The Svalbard Global Seed Vault, established in the permafrost of the mountains of Svalbard, is designed to store duplicates of seeds from seed collections around the globe.

 

Moscow State University has secured Russia’s largest-ever scientific grant to collect the DNA of every living and extinct creature for the world’s first database of its kind.

“I call the project ‘Noah’s Ark.’ It will involve the creation of a depository – a databank for the storing of every living thing on Earth, including not only living, but disappearing and extinct organisms. This is the challenge we have set for ourselves,” MSU rector Viktor Sadovnichy told journalists.

The gigantic ‘ark’, set to be completed by 2018, will be 430 sq km in size, built at one of the university’s central campuses.

“It will enable us to cryogenically freeze and store various cellular materials, which can then reproduce. It will also contain information systems. Not everything needs to be kept in a petri dish,” Sadivnichy added.

The university’s press office has confirmed that the resulting database will contain collected biomaterials from all of MSU’s branches, including the Botanical Garden, the Anthropological Museum, the Zoological Museum and others. All of the university’s departments will be involved in research and collation of materials. The program, which has received a record injection of 1 billion rubles (US$194 million), will promote participation by the university’s younger generation of scientists.

Sadovnichy also said that the bank will have a link-up to other such facilities at home, perhaps even abroad.

 

Existing facilities, such as…

Officials of the American Museum of Natural History and the U.S. National Park Service have signed an agreement for samples from endangered species in America’s parks to be added to the museum’s existing DNA collection.

The frozen samples provide researchers with genetic materials to study and help protect hundreds of species. The first new submissions will be blood samples from foxes in California’s Channel Islands National Park, followed by specimens from the American crocodile and the Hawaiian goose.

Underground in the laboratories of the museum, a half-dozen metal vats cooled with liquid nitrogen can store up to 1 million frozen tissue samples. They’re stored on racks in bar-coded boxes that are linked to a computer database so they can be located in seconds.
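The vat–rack–box lookup the museum describes is, at heart, a keyed index over barcodes. A minimal sketch (schema and barcodes invented for illustration):

```python
# Toy freezer-inventory index: barcode -> physical location.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE samples (
    barcode TEXT PRIMARY KEY,
    species TEXT, vat INTEGER, rack INTEGER, box TEXT)""")
db.executemany(
    "INSERT INTO samples VALUES (?, ?, ?, ?, ?)",
    [("AMCC-101234", "Urocyon littoralis", 3, 12, "B07"),
     ("AMCC-101235", "Branta sandvicensis", 1, 4, "A19")])

def locate(barcode):
    """Return (vat, rack, box) for a barcode, or None if unknown."""
    return db.execute(
        "SELECT vat, rack, box FROM samples WHERE barcode = ?",
        (barcode,)).fetchone()

print(locate("AMCC-101234"))  # (3, 12, 'B07')
```

The primary-key index is what makes a scan of a million boxes a sub-second lookup.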

The park service doesn’t have such a state-of-the-art facility. With this kind of DNA analysis it can better manage existing animal populations, using genetic relationships among the samples to trace animals’ movements on land and estimate population sizes.

The samples will provide researchers “with a uniform method to collect, analyze and store genetic material collected in parks,” acting National Park Service Director Dan Wenk said.

The lab is part of the Ambrose Monell Collection for Molecular and Microbial Research, which has allowed geneticists to use its samples for free since 2001. Researchers collect tissue samples from animals in the wilderness — an effort essential to Earth’s biodiversity at a time of massive species loss.

Wenk said the DNA samples going to the Manhattan museum are “a great asset” to the Endangered Species Act of 1973, which aims to restore all federally listed threatened and endangered species “to the point where they are again viable, self-sustaining members of their ecological communities.”

Julie Feinstein, who heads the museum’s sample collection, emphasizes that although DNA is extracted from tissue, cloning “is not part of our mission.”

The main goal, museum officials said, is preservation of species.

 

From this site anyone can link to the first fully sequenced black-footed ferret nuclear genomes—from four representative specimens—and participate in the analysis and interpretation of the data.

The black-footed ferret is the ideal animal for this kind of research.  Over 25 years of captive breeding by US Fish & Wildlife, the Smithsonian, and other institutions has yielded some 8,300 ferrets and highly detailed stud books of their breeding record.  That kind of attention was necessary because the entire living population is descended from only seven founders.  They were part of a tiny remnant population of wild ferrets found in Wyoming in 1981, after the species had been given up for extinct.  Their gene pool may have already been severely bottlenecked.

One of America’s most endangered animals, the black-footed ferret, can become a model for developing genomic diagnosis and genetic rescue techniques that could help many endangered species similarly threatened by the “extinction vortex” of progressive inbreeding and genetic drift.

The problem is the continuing decline of genetic variability in the black-footed ferret’s gene pool.  Any solution will require discovering the exact nature of that decline—genomic diagnosis.  And then techniques may be developed to restore genetic variability—genetic rescue.
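The "genomic diagnosis" side can be illustrated with the textbook drift model: expected heterozygosity decays by a factor of (1 − 1/(2Nₑ)) per generation. A sketch assuming the effective population size equals the seven founders (a deliberate simplification; managed breeding keeps the real Nₑ higher):

```python
# Illustrative only: expected heterozygosity under genetic drift,
# H_t = H_0 * (1 - 1/(2*Ne))**t, with Ne set to the seven founders.

def heterozygosity(h0, ne, generations):
    return h0 * (1 - 1 / (2 * ne)) ** generations

h0 = 1.0  # relative to the founding gene pool
for t in (1, 10, 25):
    print(t, round(heterozygosity(h0, 7, t), 3))
```

After ten generations roughly half the founders' heterozygosity is expected to remain, which is why "injecting" diversity recovered from museum specimens is attractive.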

 

http://research.ncsu.edu/ges

In his talk “The Future of Human Genomics and Synthetic Biology,” Church discussed the exponentially fast pace of emerging genetic technologies (due in part to his own inventions and advancements in the fields of genetics and synthetic biology) and the application of these technologies to present and future work. Synthetic biology, which includes altering gene sequences and expression of genes in living organisms, relies on existing and emerging technologies to manipulate and reconstruct genes and genomes.

Church noted that we have been genetically engineering humans for decades…

Many technologies in synthetic biology exist and continue to develop. The difficulty largely lies in deciding which technological implementations to allow. Church noted future applications such as releasing more genetically modified organisms (GMOs) into the wild, altering ageing genes to extend human life, and manipulating genes to help humans adapt to life in space.


Thylacines

The thylacine (/ˈθaɪləsiːn/ thy-lə-seen, or /ˈθaɪləsaɪn/ thy-lə-syn, also /ˈθaɪləsɨn/; binomial name: Thylacinus cynocephalus, Greek for “dog-headed pouched one”) was the largest known carnivorous marsupial of modern times. It is commonly known as the Tasmanian tiger (because of its striped back) or the Tasmanian wolf. Native to continental Australia, Tasmania and New Guinea, it is thought to have become extinct in the 20th century. It was the last extant member of its family, Thylacinidae; specimens of other members of the family have been found in the fossil record dating back to the early Miocene.

The thylacine had become extremely rare or extinct on the Australian mainland before British settlement of the continent, but it survived on the island of Tasmania along with several other endemic species, including the Tasmanian devil. Intensive hunting encouraged by bounties is generally blamed for its extinction, but other contributing factors may have been disease, the introduction of dogs, and human encroachment into its habitat.


Despite its official classification as extinct, sightings are still reported, though none have been conclusively proven.

Surviving evidence suggests that it was a relatively shy, nocturnal creature with the general appearance of a medium-to-large-size dog, except for its stiff tail and abdominal pouch (which was reminiscent of a kangaroo) and a series of dark transverse stripes that radiated from the top of its back (making it look a bit like a tiger).

Like the tigers and wolves of the Northern Hemisphere, from which it obtained two of its common names, the thylacine was an apex predator. As a marsupial, it was not closely related to these placental mammals, but because of convergent evolution it displayed the same general form and adaptations. Its closest living relative is thought to be either the Tasmanian devil or numbat. The thylacine was one of only two marsupials to have a pouch in both sexes (the other being the water opossum). The male thylacine had a pouch that acted as a protective sheath, covering his external reproductive organs while he ran through thick brush. The thylacine has been described as a formidable predator because of its ability to survive and hunt prey in extremely sparsely populated areas.


[SEE ALSO: Mass killings of Tasmanian Aborigines were reported as having occurred as part of the Black War in Van Diemen’s Land.]

The thylacine is a candidate for cloning and other molecular science projects due to its recent demise and the existence of several well preserved specimens.



As a scientific concept, extinction is distinguished from its theological and apocalyptic variant by the work of naturalists and zoologists such as Georges Cuvier and the Comte de Buffon. In the attempt to discover a scientific framework for studying life that would avoid the religious framework of the Great Chain of Being, the study of fossils became a key locus for investigating the emergence and disappearance of living beings. Cuvier, in particular, became a proponent of “catastrophism,” the theory that the Earth is periodically visited by sudden, cataclysmic events that not only radically alter the Earth’s geological composition, but the organisms living on the Earth as well. In the late 18th and early 19th century, Cuvier published a number of paleontological studies that established extinction as a scientific reality, culminating in his multi-volume work, Recherches sur les Ossemens Fossiles de Quadrupèdes. As Cuvier provocatively notes, behind the revolutions of nations there lies another type of revolution, that of the planet itself:

“The ancient history of the globe, the definitive term towards which all research tends, is also in itself one of the most curious objects to have captured the enlightened mind; and, if one allows oneself to follow, in the infancy of our species, the nearly invisible traces of so many extinct nations, one will also find there, gathered in the shadows of the Earth’s infancy, the traces of revolutions anterior to the existence of all nations.”

~ In The Dust Of This Planet

Everybody talks about the Tunguska Event, nobody mentions the Carrington Event.


“In the past two days, the sun has unleashed three monster solar flares from a sunspot group the size of Jupiter. These powerful phenomena are amazing to watch, but if they were pointed toward the Earth, they would spell big trouble. Radiation from the sun’s coronal mass ejections (CMEs) could disrupt our power grids and satellites.

Unfortunately, the sun and its atmosphere are devilishly hard to predict. But new research published today in Nature reveals new information about how CMEs form, which could help scientists improve their forecast.”


All this may seem like doomsaying, but the historic record suggests otherwise: The Halloween Storm, in fact, pales in comparison to several earlier events. In 1989, ground currents from a less intense geomagnetic storm knocked out a high-voltage transformer at a hydroelectric power plant in Quebec, plunging the Canadian province into a nine-hour blackout on an icy winter night. A far more extreme geomagnetic storm washed over the Earth in May of 1921, its magnitude illustrated in world-girdling aurorae and in fires that broke out in telegraph offices, telephone stations, and railroad routing terminals — sites that sucked up geomagnetic currents traveling through nascent power grids. An even more extreme storm in September 1859 caused geomagnetic currents so strong that for days telegraph operators could disconnect their equipment from battery power and send messages solely via the “auroral current” induced in their transmission lines. The 1859 storm is known as the “Carrington Event,” after the British astronomer Richard Carrington, who witnessed an associated solar flare and connected it with the subsequent earthbound disturbances.

“The physics of the Sun and of Earth’s magnetic field have not fundamentally changed, but we have,” Kappenman says. “We decided to build the power grids, and we’ve progressively made them more vulnerable as we’ve connected them to every aspect of our lives. Another Carrington Event is going to occur someday.” But unlike in 1859, when the telegraph network was the sole technology endangered by space weather, or in 1921, when electrification was in its infancy, today’s vulnerable systems are legion.


Not everyone is optimistic that our modern society will successfully address the problem—including physicist Avi Schnurr, who is also the president of the Electric Infrastructure Security Council, a non-governmental organization advocating space-weather resilience. “If a Carrington Event happened right now it probably wouldn’t be a wake-up alarm—it would be a goodnight call,” he says. “This is a case where we have to do something that is not often successfully achieved by governments, and certainly not by democracies: We have to take concerted action against a predicted threatening event without having actually experienced the event itself in modern times.”

Protecting the power grid on Earth is, in principle, relatively straightforward. (Countries such as Finland and Canada have already begun to take action, with promising results.) Most high-voltage transformers are directly connected to the ground to neutralize power surges from lightning strikes and other transient phenomena. They’re vulnerable to space weather because geomagnetic currents flow upward through these ground connections.

By placing arrays of electrical resistors or capacitors as intermediaries between the ground and critical transformers, like those serving nuclear power plants and major metropolitan areas, that connection would be severed—and the space-weather threat greatly reduced if not entirely eliminated. Experts estimate this could be accomplished within a few years, at a cost of hundreds of thousands of dollars per transformer. In practice, however, it’s not so easy. So far, U.S. power companies have balked at voluntary installation of such devices, and current government regulations don’t require such protections.
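The capacitor approach works because a series capacitor's impedance grows without bound as frequency falls toward DC, while staying negligible at grid frequency. Rough numbers (the component value is invented for illustration, not an engineering design):

```python
# Why a series capacitor blocks geomagnetically induced current (GIC):
# |Z| = 1/(2*pi*f*C) is tiny at 60 Hz but enormous at the quasi-DC
# (~millihertz) frequencies of a geomagnetic storm.
import math

def cap_impedance_ohms(f_hz, c_farads):
    return 1.0 / (2 * math.pi * f_hz * c_farads)

C = 0.1  # farads; a large capacitor bank, chosen for illustration
print(cap_impedance_ohms(60, C))     # ~0.027 ohm at grid frequency
print(cap_impedance_ohms(0.001, C))  # ~1600 ohm at storm timescales
```

The same device that is nearly invisible to 60 Hz load current presents tens of thousands of times more impedance to the storm-driven quasi-DC flow.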


In a paper published on arXiv, an online repository, two astronomers, Tsvi Piran of the Hebrew University of Jerusalem and Raul Jimenez of the University of Barcelona, argue that some regions of the galaxy are less friendly to life than others. Moreover, the friendly areas may have been smaller in the past than they are now. If that is true, then it may be the case that complex life on Earth is just about as ancient as it is possible for complex life to be. And, since complexity necessarily precedes intelligence, that might mean human beings really are the first intelligent life forms to evolve in the Milky Way.

Dr Piran and Dr Jimenez are interested in gamma-ray bursts (GRBs), the most energetic phenomena yet discovered in the universe. No one is certain what causes them, but the leading theories are a hypernova—the sudden collapse of a massive star to form a black hole—or a collision between two neutron stars, the ultra-dense remnants of supernovas (slightly less massive collapsed stars). What is not in doubt is their prodigious power: a typical GRB generates as much energy in a few seconds as a star will in its entire multi-billion-year lifetime. That would be bad news for any life-bearing planet which was too close.
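That energy claim can be sanity-checked with standard round numbers, taking the low end (about 10⁴⁴ J) of isotropic-equivalent GRB energies:

```python
# Order-of-magnitude check: total solar output over a ~10 Gyr lifetime
# versus a single GRB's seconds-long release (round textbook figures).
L_sun = 3.8e26              # W, solar luminosity
lifetime_s = 1e10 * 3.15e7  # ~10 billion years in seconds
E_sun_lifetime = L_sun * lifetime_s
E_grb = 1e44                # J, low end of isotropic GRB energies
print(E_sun_lifetime / E_grb)  # ~1.2: the two are indeed comparable
```

So even a modest GRB releases in seconds roughly what the Sun will emit over its entire main-sequence life, with brighter bursts exceeding that by orders of magnitude.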

The idea that a nearby GRB (nearby, in this context, means within about 10,000 light-years) would wreck the biosphere of an Earthlike planet was proposed in 1999 by James Annis of Fermilab, in Illinois. First, the blast of radiation would instantly kill most living organisms on or near the surface—not just those facing the blast but also, via secondary showers of charged particles and re-emitted gamma rays, those on the hemisphere facing away from it. Second, the gamma rays would also stir up chemical reactions that create ozone-killing molecules sufficiently powerful to destroy more than 90% of an Earthlike planet’s ozone layer, and keep it destroyed for several years. This would let in intense ultraviolet light from the planet’s parent star, which would blitz any complex biological molecules it hit. Anything that survived the initial blast would thus be subjected to years of serious sunburn.

The Earthlike planet of most interest to human beings is, of course, Earth itself. Mankind’s home is 4.6 billion years old, and Dr Piran’s and Dr Jimenez’s model suggests there is almost a 90% chance that it has been hit by at least one GRB of this power in that period. For the first half of Earth’s existence, only the direct impact would have mattered, since there was no ozone layer to annihilate (the simple bacteria which existed at this time were either adapted to UV, or lived underground or underwater and were thus immune to its effects). But once photosynthesis started (about 2.3 billion years ago), oxygen—and therefore ozone, the triatomic form of that element—began to accumulate, and living things came out of hiding and got used to living under its protection. From then on, a nearby GRB would certainly have caused a mass extinction.

Any extinction that happened before about 540m years ago, when shelly animals appeared and fossils became commonplace, would probably be invisible in the geological record. But since then there have been five—one of which, that at the end of the Ordovician period, has no obvious explanation. Perhaps not coincidentally, Dr Piran’s and Dr Jimenez’s model suggests there is a 50% chance Earth has been struck by a GRB in the past 500m years.
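The quoted probabilities compound in a Poisson-like way. Under the naive assumption of a constant rate (an assumption the authors' time-dependent model does not make), the 50%-per-500-Myr figure alone would imply:

```python
# Naive constant-rate extrapolation: P(>=1 hit in 500 Myr) = 0.5
# gives rate lam = ln(2)/500 per Myr; compounded over Earth's
# ~4,600 Myr this yields ~99.8%. The article's ~90% lifetime figure
# instead comes from the paper's time-varying model.
import math

lam = math.log(2) / 500.0           # hits per Myr
p_earth = 1 - math.exp(-lam * 4600)
print(round(p_earth, 3))  # 0.998
```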

SUPPLEMENTARY NOTES #FRINGECULTURE


Follow this multi-disciplinary, scientific study as it examines the evidence of a great global catastrophe that occurred only 11,500 years ago. Crustal shifting, the tilting of Earth’s axis, mass extinctions, upthrusted mountain ranges, rising and shrinking land masses, and gigantic volcanic eruptions and earthquakes–all indicate that a fateful confrontation with a destructive cosmic visitor must have occurred. The abundant geological, biological, and climatological evidence from this dire event calls into question many geological theories and will awaken our memories to our true–and not-so-distant–past.

Everybody talks about the Tunguska Event, nobody mentions the theorised Younger Dryas impact, an event several orders of magnitude greater. Why is that?

The Younger Dryas impact hypothesis, also known as the Clovis comet hypothesis, is one of the competing scientific explanations for the onset of the Younger Dryas cold period. The hypothesis, which scientists continue to debate, proposes that the climate of that time was cooled by the impact or air burst of one or more comets.

The general hypothesis states that about 12,900 calibrated (10,900 14C uncalibrated) years BP, air bursts or impacts from one or more near-Earth objects set areas of the North American continent on fire, disrupted climate, and caused the extinction of most of the megafauna in North America and the demise of the North American Clovis culture after the last glacial period. The Younger Dryas cold period lasted for about 1,200 years before the climate warmed again. The swarm of objects is hypothesized to have exploded above, or possibly on, the Laurentide Ice Sheet in the region of the Great Lakes. Though no major impact crater has been identified, proponents suggest that such an air burst would have been similar to, but orders of magnitude larger than, the Tunguska event of 1908. The hypothesis proposes that animal and human life in North America not directly killed by the blast or the resulting wildfires would have suffered from the disrupted ecological relationships affecting the continent.

Further reading: Evidence for an extraterrestrial impact 12,900 years ago that contributed to the megafaunal extinctions and the Younger Dryas cooling – R. B. Firestone, A. West, et al.

Recent evidence continues to oppose the YDB impact hypothesis. New research analyzed sediments claimed by the hypothesis’ proponents to be deposits resulting from a bolide impact and found that they date from much later or much earlier periods than the proposed date of the cosmic impact. The researchers examined 29 sites that are commonly referenced to support the impact theory to determine whether they can be geologically dated to around 13,000 years ago. Crucially, only three of the sites actually date from that time. According to the researchers, the Younger Dryas impact evidence “fails the critical chronological test of an isochronous event at the YD onset, which, coupled with the many published concerns about the extraterrestrial origin of the purported impact markers, renders the YDIH unsupported. There is no reason or compelling evidence to accept the claim that a cosmic impact occurred ∼12,800 y ago and caused the Younger Dryas.”

Further reading: Observation of 23 supernovae that exploded <300 pc from Earth during the past 300 kyr – R. B. Firestone

Firestone (2014) asserted evidence for numerous (23) nearby (d < 300 pc) supernovae within the Middle and Late Pleistocene. If true, this would have strong implications for the irradiation of the Earth; at this rate, mass-extinction-level events due to supernovae would occur more often than once every 100 Myr. However, there are numerous errors in the application of past research. The paper overestimates likely nitrate and 14C production from moderately nearby supernovae by about four orders of magnitude. Moreover, the results are based on wrongly selected (obsolete) nitrate and 14C datasets. The use of correct and up-to-date datasets does not confirm the claimed results. The claims in the paper are invalidated.

If the Singularity can not be prevented or confined, just how bad could the Post-Human era be? Well … pretty bad. The physical extinction of the human race is one possibility. (Or as Eric Drexler put it of nanotechnology: Given all that such technology can do, perhaps governments would simply decide that they no longer need citizens!) Yet physical extinction may not be the scariest possibility. Again, analogies: Think of the different ways we relate to animals. Some of the crude physical abuses are implausible, yet… In a Post-Human world there would still be plenty of niches where human-equivalent automation would be desirable: embedded systems in autonomous devices, self-aware daemons in the lower functioning of larger sentients. (A strongly superhuman intelligence would likely be a Society of Mind with some very competent components.) Some of these human equivalents might be used for nothing more than digital signal processing. They would be more like whales than humans. Others might be very human-like, yet with a one-sidedness, a dedication that would put them in a mental hospital in our era. Though none of these creatures might be flesh-and-blood humans, they might be the closest things in the new environment to what we call human now. (I. J. Good had something to say about this, though at this late date the advice may be moot: Good proposed a “Meta-Golden Rule”, which might be paraphrased as “Treat your inferiors as you would be treated by your superiors.” It’s a wonderful, paradoxical idea (and most of my friends don’t believe it) since the game-theoretic payoff is so hard to articulate. Yet if we were able to follow it, in some sense that might say something about the plausibility of such kindness in this universe.)

I have argued above that we cannot prevent the Singularity, that its coming is an inevitable consequence of the humans’ natural competitiveness and the possibilities inherent in technology. And yet… we are the initiators.


A textbook dystopia – and Moravec is just getting wound up. He goes on to discuss how our main job in the 21st century will be “ensuring continued cooperation from the robot industries” by passing laws decreeing that they be “nice,” and to describe how seriously dangerous a human can be “once transformed into an unbounded superintelligent robot.” Moravec’s view is that the robots will eventually succeed us – that humans clearly face extinction.


Given the incredible power of these new technologies, shouldn’t we be asking how we can best coexist with them? And if our own extinction is a likely, or even possible, outcome of our technological development, shouldn’t we proceed with great caution?


To continue: what Bill proposed in order to avoid (according to him) planetary destruction and the extinction of human and animal species through techno-advance is “…to renounce them, restricting research in the technological domains that are too dangerous, putting limits on our research of certain knowledge.” But what he does not analyze is that Technology never stops, always tending toward Domination on greater and smaller scales.

Perhaps there are some scientists who believe that continuing the study of nanotechnology would be an immoral error, and who therefore leave their work and academic positions; but there will be others who continue as couriers of civilized progress, stopping for nothing and at nothing.


A subsequent book, Unbounding the Future: The Nanotechnology Revolution, which Drexler cowrote, imagines some of the changes that might take place in a world where we had molecular-level “assemblers.” Assemblers could make possible incredibly low-cost solar power, cures for cancer and the common cold by augmentation of the human immune system, essentially complete cleanup of the environment, incredibly inexpensive pocket supercomputers – in fact, any product would be manufacturable by assemblers at a cost no greater than that of wood – spaceflight more accessible than transoceanic travel today, and restoration of extinct species.

Put simply, Fedorov defined the “common task” as the abolition of death, and resurrection of the dead – all the dead, from all generations. I said in my last lecture that initially his ideas appear far more eccentric than Solov’ev’s, but I would suggest that is only the case at first glance. And that’s because of the way he approached this idea. The clue here is in the word “task.” Fedorov does not believe that the dead will simply start rising from their graves at some point. Rather, humanity needs to direct its work towards the sacred task of physically resurrecting the dead; this is active resuscitation, not passive resurrection (Lord, p. 410), and his meaning is not figurative, it is literal.


What is significant here is that he is not referring solely to spiritual work; his idea is that all branches of knowledge should be harnessed towards fulfilment of the common task, from history and museum studies (learning about our dead ancestors) to biology (understanding the physical make-up of human beings) to physics and technology (everything from controlling the weather and gravity to space exploration – of which more in a second). Walicki (who was not a great fan of Fedorov’s, and thinks his importance has been exaggerated) says on this: “he had an almost magical belief in man’s ability to master the forces of nature and to use them to find a solution to ‘ultimate issues’.” (386) Perhaps that is the case, but I would at least add Fedorov’s contention that knowledge, learning, and technological advances had been limited in the past because of the disunified nature of society: those he called “the learned” were not focusing their energies in the right direction. Once they became aware of the common task and the role they had to play in it, and all learning was directed towards this aim, technological progress would follow.

The fusion of religious impetus with technological advances is one of the most striking aspects of Fedorov’s conception of the common task. Among other inventions, he envisaged rocket science and space travel as essential developments, because the means to resurrect the dead do not exist on earth due to the process of disintegration:

to recover particles of disintegrated ancestors, Fedorov imagined, research teams [would travel] to the moon, the planets, and to distant points throughout the universe. Eventually these outer points of the cosmos would be inhabited by the resurrected ancestors, whose bodies might be synthesized so as to live under conditions that could not now support human life as it is known (Young, p. 15).

And although this may have seemed rather far-fetched and more suitable for fiction than religious philosophy at the end of the nineteenth century, it is well known that one of Fedorov’s disciples was the father of Russian rocket science Konstantin Tsiolkovsky (1857-1935), who spent three years studying in the Rumyantsev museum where Fedorov worked, and who later propounded a theory of cosmism that had much in common with Fedorov’s, as it involved space colonization as a route to human perfection and immortality.