Mobile data offloading with WiFi and Femtocells

Mobile data offloading is the use of complementary network technologies to deliver data originally targeted for cellular networks. For mobile operators, the main purpose of offloading is to prevent congestion of the cellular networks.

The main complementary network technologies used for mobile data offloading are WiFi and femtocells. WiFi offloading is an emerging business domain, with multiple companies entering the market with proprietary solutions. Depending on the services to be offloaded and on the business model, there may be a need for interworking between WiFi and mobile cellular networks. Three solutions have been analyzed so far :

  • Enhanced Generic Access Network (EGAN), a tight coupling architecture, specified by 3GPP (3rd Generation Partnership Project)
  • Interworking Wireless LAN (IWLAN), a loose coupling architecture, specified by 3GPP
  • Direct connection to the public Internet, the most straightforward way to offload data to the WiFi networks

There are three main initiation schemes for offloading procedures: WLAN scanning controlled by the ANDSF (Access Network Discovery and Selection Function), user initiation, or remotely-managed initiation.

Mobile data offloading for 3G/4G networks using femtocell technology is promoted by the industry body Small Cell Forum.

DAS, SAN, RAID and NAS

Last update : September 11, 2013

Direct-attached storage (DAS) is computer data storage directly attached to a server or workstation, without a storage network in between. The main protocols used for DAS connections are ATA, SATA, eSATA, SCSI, SAS, and Fibre Channel.

A storage area network (SAN) is a dedicated network that provides access to consolidated, block level data storage. A SAN does not provide file abstraction, only block-level operations.

A redundant array of independent disks (RAID) is a storage technology that combines multiple disk drive components into a single logical unit. Data is distributed across the drives in one of several ways, called RAID levels, depending on the required level of redundancy and performance. In RAID 1, for example, data is written identically to two drives, thereby producing a mirrored set.
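
As an illustration of the RAID 1 principle, the following minimal Python sketch (a toy model with invented class and method names, not a real RAID implementation) mirrors every write to two simulated drives, so the data remains readable after either drive fails :

```python
# Minimal sketch of RAID 1 mirroring (illustration only, not a real RAID driver).
class Raid1:
    def __init__(self, blocks):
        # Two simulated drives, each a list of blocks; None marks a failed drive.
        self.drives = [[None] * blocks, [None] * blocks]

    def write(self, block, data):
        # RAID 1: every write goes identically to both drives (mirroring).
        for drive in self.drives:
            if drive is not None:
                drive[block] = data

    def read(self, block):
        # Read from the first surviving drive.
        for drive in self.drives:
            if drive is not None:
                return drive[block]
        raise IOError("all drives failed")

    def fail_drive(self, index):
        # Simulate the loss of one drive.
        self.drives[index] = None

array = Raid1(blocks=8)
array.write(0, b"payload")
array.fail_drive(0)          # one drive dies ...
print(array.read(0))         # ... but the data is still readable: b'payload'
```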

Network-attached storage (NAS) is file-level computer data storage connected to a computer network, providing data access to a heterogeneous group of clients. A NAS, in contrast to a SAN, uses file-based network sharing protocols such as NFS, SMB/CIFS or AFP. NAS systems are networked appliances which contain one or more hard drives, often arranged into logical, redundant storage containers or RAID arrays. The benefits of network-attached storage, compared to general-purpose file servers, include faster data access, easier administration, and simple configuration.

The storage space can consist of USB drives or hard disk drives. A NAS can be sold with or without the drives included. Like PCs, NAS units have memory and processors, and a better processor and more memory yield better performance. Linux is usually used as the embedded operating system. Noise and security are two other important concerns.

In recent years, NAS devices have been gaining popularity in the small office/home office (SOHO) market, because they are useful for more than just centralized storage provided to client computers in environments with large amounts of data. Typical other usages are :

  • load balancing
  • email server
  • web server
  • media server
  • multimedia streaming
  • cloud-based backup
  • low-cost video surveillance
  • BitTorrent client

A list of NAS manufacturers is provided at Wikipedia. Links to additional information about SANs are provided below :

The periodic table of chemical elements

A chemical element is a pure chemical substance consisting of one type of atom, distinguished by its atomic number, which is the number of protons in its nucleus. Atoms are built of subatomic particles: protons, neutrons and electrons.

Isotopes are atoms of the same element (same number of protons), but having different numbers of neutrons. Most naturally occurring elements (66 of 94) have more than one stable isotope. For example, there are three main isotopes of carbon. All carbon atoms have 6 protons in the nucleus, but they can have either 6, 7, or 8 neutrons. Since the mass numbers of these are 12, 13 and 14 respectively, the three isotopes of carbon are known as carbon-12 (12C), carbon-13 (13C), and carbon-14 (14C). Carbon in everyday life and in chemistry is a mixture of 12C, 13C, and a very small fraction of 14C atoms. Its presence in organic materials is the basis of the radiocarbon dating method to date archaeological, geological, and hydrogeological samples.
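
As an illustration of the radiocarbon dating principle, the following short Python sketch (illustrative only; it assumes the commonly quoted 14C half-life of about 5730 years and ignores calibration) estimates the age of a sample from its remaining fraction of 14C :

```python
import math

# Radiocarbon dating sketch: age from the remaining fraction of carbon-14.
# Decay law: N(t) = N0 * (1/2) ** (t / T_HALF)  =>  t = T_HALF * log2(N0 / N)
T_HALF = 5730.0  # approximate half-life of 14C in years

def radiocarbon_age(remaining_fraction):
    """Estimated age in years for a sample retaining the given 14C fraction."""
    return T_HALF * math.log2(1.0 / remaining_fraction)

print(radiocarbon_age(0.5))    # ~5730 years (one half-life)
print(radiocarbon_age(0.25))   # ~11460 years (two half-lives)
```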

The tabular display of the chemical elements, organized on the basis of their atomic numbers, electron configurations, and recurring chemical properties is called the periodic table. Although precursors exist, Dmitri Mendeleev is generally credited with the publication, in 1869, of the first widely recognized periodic table.

Periodic table of chemical elements (Wikipedia)

As of 2012, the periodic table contains 118 confirmed chemical elements. The latest, ununseptium, was identified in 2010. Of these 118 elements, 114 have been officially recognized and named by the International Union of Pure and Applied Chemistry (IUPAC). A total of 98 are known to occur naturally on earth. 80 of them are stable, while the others are radioactive, decaying into lighter elements. A detailed list of the 118 known chemical elements is available at Wikipedia.

The lightest of the chemical elements are hydrogen and helium, both created by Big Bang nucleosynthesis during the first 20 minutes of the universe’s existence. They are by far the most abundant chemical elements in the universe. However, iron is the most abundant element making up the earth as a whole, and oxygen is the most common element in the earth’s crust.

Although all known chemical matter is composed of chemical elements, chemical matter itself constitutes only about 15% of the matter in the universe. The remainder is dark matter, a mysterious substance which is not composed of chemical elements.

When two distinct elements are chemically combined, with the atoms held together by chemical bonds, the result is termed a chemical compound. Two thirds of the chemical elements occur on earth only as compounds. Just six elements – carbon, hydrogen, nitrogen, oxygen, calcium, and phosphorus – make up almost 99% of the composition of a human body.

 

Pantheism and the Anthropic Principle

Pantheism symbols

Pantheism is the belief that everything composes an all-encompassing, immanent God, or that the Universe (or Nature) is identical with divinity. Pantheists thus do not believe in a personal or anthropomorphic god.

The Universal Pantheist Society and the World Pantheist Movement (WPM) are two organizations of people associated with pantheism.

The Copernican principle, named after Nicolaus Copernicus, states that the Earth is not the center of the universe. Copernicus was a Renaissance astronomer and the first person to formulate a comprehensive heliocentric cosmology. The Anthropic Principle was first raised by Brandon Carter in 1973 in reaction to the Copernican principle. Carter stated “Although our situation is not necessarily central, it is inevitably privileged to some extent”. The anthropic principle has given rise to some confusion and controversy, partly because the phrase has been applied to several distinct ideas.

The anthropic principle is related to the fundamental parameters, that is the dimensionless physical constants and the initial conditions of the Big Bang. Connections between physical constants that seem to be necessary for the existence of life in the universe are called anthropic coincidences. Many examples of claimed anthropic coincidences can be found in the literature. The constants of nature seem to be extraordinarily fine-tuned for the production of life. Opponents of this fine-tuning argument hold that the universe is less fine-tuned than often claimed, or that there is not one universe but a whole infinite ensemble of universes with all possible fundamental parameters, the multiverse.

Particles, strings and M-Theory

In the physical sciences, a particle is a small localized object to which several physical properties, such as volume or mass, can be ascribed.

In particle physics, an elementary particle (or fundamental particle) is a particle not known to be made up of smaller particles. If an elementary particle truly has no substructure, then it is one of the basic building blocks of the universe from which all other particles are made.

The Standard Model of particle physics has 61 particles :

  • 2*3*3 (=18) quarks (fermions) with corresponding antiparticles (total 36)
  • 2*3 (=6) leptons (fermions) with corresponding antiparticles (total 12)
  • 1*8 gluons (bosons) without antiparticles (total 8)
  • 1 W boson with one corresponding antiparticle (total 2)
  • 1 Z boson without antiparticle (total 1)
  • 1 photon (boson) without antiparticle (total 1)
  • 1 Higgs boson without antiparticle (total 1)
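
The arithmetic behind the count of 61 particles can be checked with a few lines of Python, simply tallying the list above :

```python
# Tally of the 61 particles of the Standard Model, following the list above.
quarks      = 2 * 3 * 3 * 2   # 18 quarks + 18 antiquarks          = 36
leptons     = 2 * 3 * 2       #  6 leptons + 6 antileptons         = 12
gluons      = 8               #  8 gluons, no distinct antiparticles
w_bosons    = 2               #  W boson and its antiparticle
z_boson     = 1
photon      = 1
higgs_boson = 1

total = quarks + leptons + gluons + w_bosons + z_boson + photon + higgs_boson
print(total)  # 61
```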

Standard model of particles (Wikipedia)

Quarks and leptons are fermions. According to the spin-statistics theorem, fermions respect the Pauli exclusion principle. Each fermion has a corresponding antiparticle.

Elementary fermions are matter particles, segmented as :

  • 6 Quarks : up, down, charm, strange, top, bottom
  • 3 Leptons with electrical charge : electron, muon, tau
  • 3 Leptons without electrical charge : electron neutrino, muon neutrino, tau neutrino

Pairs from each classification are grouped together to form a generation, with corresponding particles exhibiting similar physical behavior.

Elementary bosons are force-carrying particles, segmented as :

  • 8 gluons, carriers of the strong interaction
  • 1 photon, carrier of the electromagnetic interaction
  • W and Z bosons, carriers of the weak interaction
  • 1 Higgs boson, associated with the Higgs field that gives mass to the other elementary particles

Gluons carry color charge and come in 8 different color states, which is why the Standard Model counts 8 gluons.

The elementary fermions and bosons are represented in the following scheme :

Standard Model of elementary particles (Wikipedia)

The Higgs particle is a massive scalar elementary particle, i.e. it has zero intrinsic spin.

Additional elementary particles may exist, such as the graviton, which would mediate gravitation. Such particles lie beyond the Standard Model.

Composite particles are hadrons, made of quarks, held together by the strong interaction (also called strong force). Hadrons are categorized into two families: baryons and mesons. Baryons are hadrons and fermions, mesons are hadrons and bosons.

Baryons are made of three valence quarks. The best-known baryons are the proton and the neutron, which make up most of the mass of the visible matter in the universe. Together they form the atomic nucleus; an atom is a basic unit of matter that consists of such a dense central nucleus surrounded by a cloud of negatively charged electrons.

Each type of baryon has a corresponding antiparticle (antibaryon) in which quarks are replaced by their corresponding antiquarks. For example : just as a proton is made of two up-quarks and one down-quark, its corresponding antiparticle, the antiproton, is made of two up-antiquarks and one down-antiquark.
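
The electric charge of a hadron follows from the charges of its valence quarks (up quark: +2/3, down quark: -1/3, antiquarks with opposite signs). The short Python sketch below, a simple illustration with invented helper names, reproduces the charges of the proton, the neutron and the antiproton :

```python
from fractions import Fraction

# Electric charges of the relevant quarks, in units of the elementary charge e.
CHARGE = {
    "u": Fraction(2, 3), "d": Fraction(-1, 3),             # up and down quarks
    "anti-u": Fraction(-2, 3), "anti-d": Fraction(1, 3),   # their antiquarks
}

def hadron_charge(quarks):
    """Sum the charges of the valence quarks of a hadron."""
    return sum(CHARGE[q] for q in quarks)

print(hadron_charge(["u", "u", "d"]))                 # proton:     +1
print(hadron_charge(["u", "d", "d"]))                 # neutron:     0
print(hadron_charge(["anti-u", "anti-u", "anti-d"]))  # antiproton: -1
```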

Mesons are hadronic subatomic particles composed of one quark and one antiquark, bound together by the strong interaction. Pions are the lightest mesons. A list of all mesons and a list of all particles are available at Wikipedia.

All particles of the Standard Model have been observed in nature, including the Higgs boson. Particles are described by quantum field theory (quantum mechanics). String theory is an active research framework in particle physics that attempts to reconcile quantum mechanics and general relativity. String theory posits that the elementary particles are not 0-dimensional objects, but rather 1-dimensional oscillating lines (strings). A key feature of string theory is the existence of D-branes. There are different flavors of string theory. The version that incorporates fermions and supersymmetry is called superstring theory.

An extension of superstring theory is M-theory, in which 11 dimensions are identified. According to Stephen Hawking in particular, M-theory is the only candidate for a complete theory of the universe, the theory of everything (TOE), a self-contained mathematical model that describes all fundamental forces and forms of matter.

Learning and the Hebbian theory

Learning is acquiring new, or modifying existing, knowledge, behaviors, skills, values, or preferences and may involve synthesizing different types of information. The ability to learn is possessed by humans, animals and some machines. Progress over time tends to follow learning curves.

Three domains of learning have been proposed by Benjamin Bloom, an American educational psychologist who made contributions to the classification of educational objectives and to the theory of mastery learning :

  • the cognitive domain (knowledge and mental skills)
  • the affective domain (attitudes and feelings)
  • the psychomotor domain (manual and physical skills)

There are also numerous types of learning. Wikipedia lists 17 different types with several subtypes. More types are proposed by other sources.

The adaptation of neurons in the brain during the learning process is explained by the Hebbian theory, a scientific theory in neurobiology. Introduced by Donald O. Hebb in 1949, it is also called Hebb’s rule, Hebb’s postulate, or cell assembly theory. Donald O. Hebb was a Canadian psychologist who sought to understand how the function of neurons contributed to psychological processes such as learning. He has been described as the father of neuropsychology and neural networks.
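
Hebb’s rule is often summarized as “cells that fire together wire together”: the weight of a synapse is strengthened when the presynaptic and postsynaptic neurons are active at the same time. The following minimal Python sketch of this weight-update rule is purely illustrative; the learning rate, the input pattern and the initial weights are arbitrary assumptions :

```python
import numpy as np

# Minimal sketch of Hebb's rule: the weight of a synapse grows when the
# presynaptic activity x_i and the postsynaptic activity y are high together
# (delta_w_i = eta * x_i * y).
eta = 0.1                        # assumed learning rate
x = np.array([1.0, 0.0, 1.0])    # presynaptic activities (input pattern)
w = np.full(3, 0.1)              # small initial synaptic weights

for _ in range(5):               # present the same pattern a few times
    y = float(w @ x)             # postsynaptic activity (simple linear neuron)
    w += eta * y * x             # Hebbian update: co-active synapses strengthen

print(w)   # the weights of the active inputs have grown, the silent one has not
```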

Supercomputers

Last update : August 6, 2013

Supercomputers were introduced in the 1960s and were designed primarily by Seymour Cray at Control Data Corporation (CDC), and later at Cray Research. While the supercomputers of the 1970s used only a few processors, in the 1990s, machines with thousands of processors began to appear and by the end of the 20th century, massively parallel supercomputers with tens of thousands of “off-the-shelf” processors were the norm.

ChipTest, Deep Thought and Deep Blue supercomputers

ChipTest, Deep Thought and Deep Blue were chess computers. The chess project was started at Carnegie Mellon University by Feng-hsiung Hsu in 1985. He and his collaborators were hired by IBM Research in 1989 to continue their work to build a chess machine that could defeat the world champion. On May 11, 1997, Deep Blue, with human intervention between games, won the second six-game match against world champion Garry Kasparov by two wins to one with three draws.

Blue Gene supercomputers

Blue Gene is an IBM project aimed at designing supercomputers that can reach operating speeds in the petaFLOPS range, with low power consumption. The initial design for Blue Gene was based on an early version of the Cyclops64 architecture, designed by Monty Denneau. The project created three generations of supercomputers, Blue Gene/L, Blue Gene/P, and Blue Gene/Q. In 2004, the first IBM Blue Gene computer became the fastest supercomputer in the world.

Watson supercomputers

Watson is an artificial intelligence computer system capable of answering questions posed in natural language, developed in IBM’s DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM’s first president, Thomas J. Watson. In 2011, as a test of its abilities, Watson competed on the quiz show Jeopardy!. Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage including the full text of Wikipedia, but was not connected to the Internet during the game.

IBM describes Watson as “an application of advanced Natural Language Processing, Information Retrieval, Knowledge Representation and Reasoning, and Machine Learning technologies to the field of open domain question answering”. IBM’s DeepQA technology is used for hypothesis generation, massive evidence gathering, analysis, and scoring.

Watson is related to Artificial Intelligence and to the research of commonsense knowledge, the collection of facts and information that an ordinary person is expected to know.

Collective intelligence of ants and swarms

Last update : August 6, 2013

Collective intelligence, also called group wisdom, is shared knowledge arrived at by individuals and groups. The wisdom of the crowd is the process of taking into account the collective opinion of a group of individuals rather than a single expert to answer a question. James Surowiecki published his book The Wisdom of Crowds in 2004, about the aggregation of information in groups, resulting in decisions that, he argues, are often better than could have been made by any single member of the group.
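
A toy simulation makes the aggregation argument concrete: when many people give independent, noisy estimates of the same quantity, the average of the crowd is usually much closer to the true value than a typical individual guess. The Python sketch below is a hedged illustration with invented numbers, not a reproduction of Surowiecki’s examples :

```python
import random

# Toy "wisdom of the crowd" simulation: many independent, noisy guesses of a
# true value, compared with the error of the averaged (aggregated) guess.
random.seed(42)
true_value = 1000                                   # e.g. the number of beans in a jar
guesses = [random.gauss(true_value, 300) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
mean_individual_error = sum(abs(g - true_value) for g in guesses) / len(guesses)

print(round(abs(crowd_estimate - true_value)))      # error of the aggregated estimate (small)
print(round(mean_individual_error))                 # typical individual error (much larger)
```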

Group intelligence refers to a process by which large numbers of people simultaneously converge upon the same point(s) of knowledge.

Collective intelligence, which is sometimes used synonymously with collective wisdom, is more of a shared decision process than collective wisdom. Collective intelligence is a shared intelligence that emerges from the collaboration and competition of many individuals and appears in consensus decision making in animals, humans and computer networks. The term is related to the Global Brain.

If we look at ants, we can see that they exhibit many of the characteristics and behaviours that we associate with intelligence and civilization, for example :

  • ants build cities (ant hills) which contain complex ventilation systems, waste recycling and transportation systems including highways
  • ants farm and cultivate mushrooms
  • ants raise and keep other insects for food
  • ants wage wars in organized battalions
  • ants capture slaves
  • ants teach and communicate
  • ants collaborate and do teamwork

The study of the behavior of social insects like ants and bees is part of the Swarm Intelligence (SI). This is a relatively new discipline that deals with the study of self-organizing processes both in nature and in artificial systems. The concept is employed in work on artificial intelligence. The expression was introduced by Gerardo Beni and Jing Wang in 1989, in the context of cellular robotic systems. Besides ant colonies, natural examples of SI include bird flocking, animal herding, bacterial growth and fish schooling. The application of swarm principles to robots is called swarm robotics, a special case is ant robotics. In computer science and operations research, the ant colony optimization algorithm (ACO) is used to find good paths through graphs.
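
As an illustration of the ACO idea, the following Python sketch lets artificial ants build round trips on a small, fully connected graph; shorter trips deposit more pheromone, which biases the edge choices of later ants. It is a simplified, textbook-style variant with arbitrary parameter values, not a specific published implementation :

```python
import random

# Simplified ant colony optimization (ACO) on a small fully connected graph:
# ants build round trips, shorter trips deposit more pheromone, and the
# pheromone biases the edge choices of the next generation of ants.
random.seed(1)
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
n = len(dist)
pheromone = [[1.0] * n for _ in range(n)]
evaporation, deposit, n_ants, n_iterations = 0.5, 100.0, 10, 50

def build_tour():
    tour, unvisited = [0], list(range(1, n))
    while unvisited:
        current = tour[-1]
        # Edge attractiveness ~ pheromone level divided by distance ("visibility").
        weights = [pheromone[current][j] / dist[current][j] for j in unvisited]
        nxt = random.choices(unvisited, weights=weights)[0]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

best = None
for _ in range(n_iterations):
    tours = [build_tour() for _ in range(n_ants)]
    # Evaporate part of the old pheromone, then deposit new pheromone on used edges.
    pheromone = [[p * (1 - evaporation) for p in row] for row in pheromone]
    for tour in tours:
        length = tour_length(tour)
        if best is None or length < tour_length(best):
            best = tour
        for i in range(n):
            a, b = tour[i], tour[(i + 1) % n]
            pheromone[a][b] += deposit / length
            pheromone[b][a] += deposit / length

print(best, tour_length(best))  # the shortest round trip (length 18), e.g. [0, 1, 3, 2]
```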

collective intelligence of ants

ANTS2012, September 2012, Brussels

A first workshop, ANTS98 on Ant Colony Optimization, “From ant colonies to artificial ants”, took place in October 1998 in Brussels. The eighth international conference, ANTS2012 (in the meantime renamed Swarm Intelligence), took place in September 2012 in Brussels.

In 2006, the Center for Collective Intelligence (CCI) was created at MIT to make collective intelligence a topic of serious academic study. O’Reilly Media published in 2007 the book Programming Collective Intelligence, written by Toby Segaran.

Mammal and Human Brain Projects

Last update : August 6, 2013

Human Brain Project (2013)

The Human Brain Project (HBP) was submitted on 23 October 2012 for funding under the European Union’s FET Flagship program. FET (Future & Emerging Technologies) flagships are ambitious, large-scale, science-driven research initiatives that aim to achieve a visionary goal. On January 28, 2013, the European Commission officially announced the selection of the Human Brain Project as one of its two FET Flagship projects.

The goal of the HBP is to understand and mimic the way the human brain works. The Blue Brain Project’s success has demonstrated the feasibility of the HBP general strategy.

The project will be coordinated by the École Polytechnique Fédérale de Lausanne (EPFL) and will be hosted at the NEUROPOLIS platform. The HBP team will include many of Europe’s best neuroscientists, doctors, physicists, mathematicians, computer engineers and ethicists. The leaders of the different sub-groups are : Universidad Politécnica de Madrid, Forschungszentrum Jülich GmbH, CEA, Le Centre national de la recherche scientifique, Karolinska Institutet, Centre hospitalier universitaire vaudois, Universität Heidelberg, Technische Universität München, Institut Pasteur. In total, more than 120 teams in 90 scientific institutions from 22 countries will contribute to the HBP. A full list of partners and collaborators is presented at the HBP website. The HBP will also be open to groups and individual scientists who are not members of the original consortium. This will be handled by the HBP Competitive Calls Programme.

The Human Brain Project has the potential to revolutionize technology, medicine, neuroscience, and society. It will drive the development of new technologies for supercomputing and for scientific visualization. Models of the brain will allow us to design computers, robots, sensors and other devices far more powerful, more intelligent and more energy efficient than any we know today. Brain simulation will help us understand the root causes of brain diseases, to diagnose them early, to develop new treatments, and to reduce reliance on animal testing. The project will also throw new light on questions human beings have been asking for more than two and a half thousand years. What does it mean to perceive, to think, to remember, to learn, to know, to decide? What does it mean to be conscious?

A video of the HBP is available at the Vimeo website.

The HBP is organized in thirteen subprojects :

Blue Brain Project (2005)

The Blue Brain Project is an attempt to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level. The aim of the project, founded in May 2005 by the Brain and Mind Institute of the École Polytechnique Fédérale de Lausanne (EPFL), is to study the brain’s architectural and functional principles. The project is headed by the Institute’s director, Henry Markram.

Using an IBM Blue Gene supercomputer running Michael Hines’s NEURON software, the simulation involves a biologically realistic model of neurons. There are numerous sub-projects run by universities and independent laboratories.

The current version 7.2 of NEURON is available as a cross-platform program under a GNU GPL licence from Yale and Duke universities.
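
NEURON can be scripted through its Python interface. The following minimal single-compartment example is a hedged sketch based on the standard NEURON tutorials; the section size, stimulus values and simulation time are illustrative assumptions :

```python
from neuron import h                 # NEURON's Python interface
h.load_file("stdrun.hoc")            # load NEURON's standard run system

# A single-compartment cell with Hodgkin-Huxley channels (illustrative values).
soma = h.Section(name="soma")
soma.L = soma.diam = 20              # microns
soma.insert("hh")                    # Hodgkin-Huxley sodium/potassium channels

# A short current pulse injected in the middle of the soma.
stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5, 1, 0.5   # ms, ms, nA

# Record time and membrane potential, then run the simulation.
t, v = h.Vector(), h.Vector()
t.record(h._ref_t)
v.record(soma(0.5)._ref_v)
h.finitialize(-65)                   # initial membrane potential in mV
h.continuerun(40)                    # simulate 40 ms

print(v.max())                       # peak voltage; an action potential peaks near +40 mV
```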

A ten-year documentary film-in-the-making about the race to reverse engineer the human brain is available at the Bluebrain Film website.

In the future the Blue Brain Project will be part of the Human Brain Project.

Brain Architecture Projects (2009)

The Brain Architecture Project is a collaborative effort aimed at creating an integrated resource containing knowledge about nervous system architecture in multiple species, with a focus on mouse and human. The Brain Architecture Project Principal Investigator is Partha P. Mitra, professor at the Cold Spring Harbor Laboratory (CSHL).

The goal of the Mouse Brain Architecture (MBA) Project is to generate brainwide maps of inter-regional neural connectivity. These maps will thus specify the inputs and outputs of every brain region, at a mesoscopic level of analysis corresponding to brain compartments defined in classical neuroanatomy.

The Human Brain Architecture Project includes several components related to the human brain : The Online Brain Atlas Reconciliation Tool (OBART), The Human Brain Connectivity Database and the Co-expression networks of genes related to addiction.

The Brain Architecture Team has also been working on two prototype text-mining systems for information extraction (IE) of knowledge related to brain architecture from a large text corpus containing approximately 55,000 full-text journal articles.

Brain Reverse Engineering Lab (2011)

This project is headed by Witali L. Dunin-Barkowski, Head of the Department of Neuroinformatics at the Center for Optical Neural Technologies of the Scientific Research Institute for System Analysis of the Russian Academy of Sciences.

The main initial task of the laboratory is the creation of an open-access scientific, technological and engineering Internet resource in the form of a specialized knowledge database on the mechanisms of brain function. The plan is that, as a result of this work, the project’s team will have elaborated a full, detailed description of the mechanisms of the human brain by the end of 2015. It should then be possible to use this description in the following years to build a full-scale working analog of the human brain, based on information-technology elements and devices.

FET flagship initiatives

Last update : August 9, 2013

In 2009, the European Commission presented the strategy for research on future and emerging technologies in Europe : Moving the ICT frontiers. One action point was the identification and launch of FET flagship initiatives.

“A FET flagship initiative could model and run large-scale simulations in order to understand the way nature processes information and to apply this knowledge to develop future biocomputers. Such a unique endeavour would attract the best computer scientists, biologists and physicists from Europe and beyond.”

The official launch of the FET Flagship Pilots took place at The European Future Technologies Conference and Exhibition (FET11) in May 2011 under the auspices of the Hungarian presidency in Budapest.

Six FET Flagship pilots have been funded to create a design and description of consolidated candidate FET Flagship Initiatives, including assessment of feasibility in scientific, technical and financial terms. These six pilots are listed below :

On January 29, 2013, Vice-President Neelie Kroes from the European Commission announced the two research projects chosen as winners of the FET Flagships initiative: Graphene and Human Brain Project.