
Source: This article was published on phys.org - Contributed by Member: Logan Hochstetler

As scientific datasets increase in both size and complexity, the ability to label, filter and search this deluge of information has become a laborious, time-consuming and sometimes impossible task without the help of automated tools.

With this in mind, a team of researchers from Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley is developing innovative machine learning tools to pull contextual information from scientific datasets and automatically generate metadata tags for each file. Scientists can then search these files via a web-based search engine for scientific data, called Science Search, that the Berkeley team is building.

As a proof-of-concept, the team is working with staff at the Department of Energy's (DOE) Molecular Foundry, located at Berkeley Lab, to demonstrate the concepts of Science Search on the images captured by the facility's instruments. A beta version of the platform has been made available to Foundry researchers.


"A tool like Science Search has the potential to revolutionize our research," says Colin Ophus, a Molecular Foundry research scientist within the National Center for Electron Microscopy (NCEM) and Science Search Collaborator. "We are a taxpayer-funded National User Facility, and we would like to make all of the data widely available, rather than the small number of images chosen for publication. However, today, most of the data that is collected here only really gets looked at by a handful of people—the data producers, including the PI (principal investigator), their postdocs or graduate students—because there is currently no easy way to sift through and share the data. By making this raw data easily searchable and shareable, via the Internet, Science Search could open this reservoir of 'dark data' to all scientists and maximize our facility's scientific impact."

The Challenges of Searching Science Data

Today, search engines are ubiquitously used to find information on the Internet, but searching scientific data presents a different set of challenges. For example, Google's algorithm relies on more than 200 clues to achieve an effective search. These clues can come in the form of keywords on a webpage, metadata in images or audience feedback from billions of people when they click on the information they are looking for. In contrast, scientific data comes in many forms that are radically different from an average web page, requires context that is specific to the science and often lacks the metadata needed for effective searches.

At National User Facilities like the Molecular Foundry, researchers from all over the world apply for time and then travel to Berkeley to use extremely specialized instruments free of charge. Ophus notes that the current cameras on microscopes at the Foundry can collect up to a terabyte of data in under 10 minutes. Users then need to manually sift through this data to find quality images with "good resolution" and save that information on a secure shared file system, like Dropbox, or on an external hard drive that they eventually take home with them to analyze.

Oftentimes, the researchers that come to the Molecular Foundry only have a couple of days to collect their data. Because it is very tedious and time-consuming to manually add notes to terabytes of scientific data and there is no standard for doing it, most researchers just type shorthand descriptions in the filename. This might make sense to the person saving the file but often doesn't make much sense to anyone else.

"The lack of real metadata labels eventually causes problems when the scientist tries to find the data later or attempts to share it with others," says Lavanya Ramakrishnan, a staff scientist in Berkeley Lab's Computational Research Division (CRD) and co-principal investigator of the Science Search project. "But with machine-learning techniques, we can have computers help with what is laborious for the users, including adding tags to the data. Then we can use those tags to effectively search the data."


To address the metadata issue, the Berkeley Lab team uses machine-learning techniques to mine the "science ecosystem"—including instrument timestamps, facility user logs, scientific proposals, publications and file system structures—for contextual information. The information gathered from these sources, such as the timestamp of the experiment, notes about the resolution and filter used, and the user's request for time, provides critical context. The Berkeley Lab team has put together an innovative software stack that uses machine-learning techniques, including natural language processing, to pull contextual keywords about the scientific experiment and automatically create metadata tags for the data.
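A minimal sketch of the idea, with hypothetical source text (none of this is the actual Science Search code): pool free text from several ecosystem sources and keep the most frequent domain words as candidate metadata tags.

```python
import re
from collections import Counter

# Hypothetical sketch: combine contextual text from several ecosystem
# sources (user log, proposal, file path) and keep the most frequent
# domain words as candidate metadata tags.

STOPWORDS = {"the", "a", "an", "of", "for", "and", "to", "in", "with", "on", "at"}

def candidate_tags(sources, top_n=5):
    """sources: list of free-text strings. Returns top_n candidate tags."""
    words = []
    for text in sources:
        # keep alphabetic tokens of 3+ letters, lowercased
        words += [w.lower() for w in re.findall(r"[A-Za-z]{3,}", text)]
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

tags = candidate_tags([
    "TEAM I session 2017-03-02, high-resolution STEM imaging",
    "Proposal: atomic-resolution imaging of graphene defects",
    "/data/team1/graphene/stem_hires_0042.dm4",
])
```

Words that recur across independent sources (here "graphene", "stem", "imaging") surface as tags, which is the intuition behind mining the ecosystem rather than any single file.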

For the proof-of-concept, Ophus shared recently collected data from the Molecular Foundry's TEAM I electron microscope at NCEM with the Science Search team. He also volunteered to label a few thousand images to give the machine-learning tools some labels from which to start learning. While this is a good start, Science Search co-principal investigator Gunther Weber notes that most successful machine-learning applications typically require significantly more data and feedback to deliver better results. For example, in the case of search engines like Google, Weber notes that training datasets are created and machine-learning techniques are validated every time billions of people around the world verify their identity by clicking on all the images with street signs or storefronts after typing in their passwords, or tag their friends in an image on Facebook.

Berkeley Lab researchers use machine learning to search science data
This screen capture of the Science Search interface shows how users can easily validate metadata tags that have been generated via machine learning or add information that hasn't already been captured. Credit: Gonzalo Rodrigo, Berkeley Lab

"In the case of science data only a handful of domain experts can create training sets and validate machine-learning techniques, so one of the big ongoing problems we face is an extremely small number of training sets," says Weber, who is also a staff scientist in Berkeley Lab's CRD.

To overcome this challenge, the Berkeley Lab researchers used transfer learning to limit the degrees of freedom, or parameter counts, on their convolutional neural networks (CNNs). Transfer learning is a machine-learning method in which a model developed for one task is reused as the starting point for a model on a second task, which allows the user to get more accurate results from a smaller training set. In the case of the TEAM I microscope, the data produced contains information about which operation mode the instrument was in at the time of collection. With that information, Weber was able to train the neural network on that classification so it could generate the mode-of-operation label automatically. He then froze the convolutional layers of the network, which meant he only had to retrain the densely connected layers. This approach effectively reduces the number of trainable parameters in the CNN, allowing the team to get meaningful results from their limited training data.
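The freeze-and-retrain approach described above can be sketched in PyTorch. Everything here is an illustrative assumption (layer sizes, the four-mode classification, the toy architecture), not the actual Science Search model:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a small CNN whose convolutional feature layers
# are frozen after pretraining, so that only the densely connected
# classifier head contributes trainable parameters.

class TinyCNN(nn.Module):
    def __init__(self, n_modes=4):
        super().__init__()
        self.features = nn.Sequential(           # pretrained, to be frozen
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(         # retrained on the small set
            nn.Flatten(), nn.Linear(16 * 8 * 8, 32), nn.ReLU(),
            nn.Linear(32, n_modes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = TinyCNN()
for p in model.features.parameters():            # freeze convolutional layers
    p.requires_grad = False

# the optimizer only ever sees the classifier head's parameters
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)

out = model(torch.randn(2, 1, 32, 32))           # batch of 2 fake 32x32 images
```

With the convolutional weights fixed, only the two linear layers are fit to the few thousand labeled microscope images, which is the parameter-count reduction the article describes.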

Machine Learning to Mine the Scientific Ecosystem

In addition to generating metadata tags through training datasets, the Berkeley Lab team also developed tools that use machine-learning techniques to mine the science ecosystem for data context. For example, the data-ingest module can look at a multitude of information sources from the scientific ecosystem—including instrument timestamps, user logs, proposals and publications—and identify commonalities. Tools developed at Berkeley Lab that use natural-language-processing methods can then identify and rank words that give context to the data and facilitate meaningful results for users later on. The user will see something similar to the results page of an Internet search, where content with the most text matching the user's search words appears higher on the page. The system also learns from user queries and the search results they click on.
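A rough sketch of this kind of relevance ranking, assuming a simple TF-IDF-style score (the article does not specify Science Search's actual algorithm, and the document texts here are made up):

```python
import math
from collections import Counter

# Hypothetical sketch: rank document descriptions against a user query,
# weighting each query word by how rare it is across the corpus.

docs = {
    "img_001": "high resolution stem image of graphene lattice",
    "img_002": "low magnification overview tem image",
    "img_003": "graphene defect survey stem imaging session",
}

def rank(query, docs):
    n = len(docs)
    # document frequency: in how many docs each word appears
    df = Counter(w for text in docs.values() for w in set(text.split()))
    scores = {}
    for name, text in docs.items():
        tf = Counter(text.split())
        scores[name] = sum(
            tf[w] * math.log(n / df[w]) for w in query.split() if w in tf
        )
    return sorted(scores, key=scores.get, reverse=True)

results = rank("graphene stem", docs)
```

Documents mentioning both rare query words score highest, mirroring the described behavior where content with the most matching text appears higher on the page.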


Because scientific instruments are generating an ever-growing body of data, all aspects of the Berkeley team's science search engine needed to be scalable to keep pace with the rate and scale of the data volumes being produced. The team achieved this by setting up their system in a Spin instance on the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC). Spin is a Docker-based edge-services technology developed at NERSC that can access the facility's high-performance computing systems and storage on the back end.

"One of the reasons it is possible for us to build a tool like Science Search is our access to resources at NERSC," says Gonzalo Rodrigo, a Berkeley Lab postdoctoral researcher who is working on the natural language processing and infrastructure challenges in Science Search. "We have to store, analyze and retrieve really large datasets, and it is useful to have access to a supercomputing facility to do the heavy lifting for these tasks. NERSC's Spin is a great platform to run our search engine that is a user-facing application that requires access to large datasets and analytical data that can only be stored on large supercomputing storage systems."

An Interface for Validating and Searching Data

When the Berkeley Lab team developed the interface for users to interact with their system, they knew that it would have to accomplish two objectives: enabling effective search and allowing human input to the machine-learning models. Because the system relies on domain experts to help generate the training data and validate the models' output, the interface needed to facilitate that.

"The tagging interface that we developed displays the original data and metadata available, as well as any machine-generated tags we have so far. Expert users then can browse the data and create new tags and review any machine-generated tags for accuracy," says Matt Henderson, who is a Computer Systems Engineer in CRD and leads the user interface development effort.

To facilitate an effective search for users based on available information, the team's search interface provides a query mechanism for available files, proposals and papers that the Berkeley-developed machine-learning tools have parsed and extracted tags from. Each listed search result item represents a summary of that data, with a more detailed secondary view available, including information on tags that matched this item. The team is currently exploring how to best incorporate user feedback to improve the models and tags.

"Having the ability to explore datasets is important for scientific breakthroughs, and this is the first time that anything like Science Search has been attempted," says Ramakrishnan. "Our ultimate vision is to build the foundation that will eventually support a 'Google' for scientific data, where researchers can even  distributed datasets. Our current work provides the foundation needed to get to that ambitious vision."

"Berkeley Lab is really an ideal place to build a tool like Science Search because we have a number of user facilities, like the Molecular Foundry, that has decades worth of data that would provide even more value to the scientific community if the data could be searched and shared," adds Katie Antypas, who is the principal investigator of Science Search and head of NERSC's Data Department. "Plus we have great access to machine-learning expertise in the Berkeley Lab Computing Sciences Area as well as HPC resources at NERSC in order to build these capabilities."


While 2016 saw its share of chaos, it also produced some outstanding brain science and psych research. This list isn’t meant to be exhaustive (and it's not in any particular order), but is rather a curation of great studies covered here at Neuropsyched. It's also a preview of things to come in the new year for several topics—depression, sleep, pot, stress and memory among them.

Marijuana Compounds Show Promise Against Alzheimer’s   

Researchers at the Salk Institute discovered in 2016 that the main psychoactive compound in marijuana—tetrahydrocannabinol (THC)—and a few other active compounds remove amyloid beta proteins from lab-grown neurons. Amyloid is the toxic protein known to accumulate in the brains of Alzheimer's patients. The compounds also significantly reduced cellular inflammation in the brain, an underlying factor in the disease's progression. While preliminary, the research is an example of what may be gained by studying potential effects of marijuana compounds, and why it's vital we keep the research door open. Definitely more to come on this in 2017.


Your Brain’s Capacity Is 10 Times Greater Than Anyone Realized

We credit our brains with a lot of storage capacity and processing power, but research from 2016 hinted that we’ve been nowhere close to estimating their actual capacity. The study showed that the human brain has at least as much capacity as the entire World Wide Web (that's about ten times as much as previously thought), and it could turn out to be more. It’s all about the amazing computing power packed into synapses, the juncture points between neurons, which change in size and shape with more frequency and variation than anyone realized before now, and it’s that uncanny flexibility that holds the key to our vast neural resources. Quoting study co-senior author Terry Sejnowski, “This is a real bombshell in the field of neuroscience.”

Painkillers May Make Chronic Pain Worse

In the unintended consequences category, a study showed that just five days of morphine treatment in rats caused chronic pain that continued for several months by triggering the release of pain signals from cells in the brain and spinal cord. If the findings hold true in humans, they’d help explain the vicious cycle of prescription opioid use. The drugs numb pain at the surface level, but below the surface they may be drawing out how long a patient experiences pain, thereby extending how long the drugs are taken. Since opioid addiction can begin after a relatively short period of time, it’s easy to see how this effect could be contributing to the epidemic of painkiller addiction that's been building for the last 15 years.

Why Sugar Dependency Is Such A Hard Habit To Break

Research in 2016 deconstructed how habits rewire the brain, with one in particular showing that neural “stop” and “go” signals are reversed by habitual exposure to sugar. Not unlike drug addiction, sugar dependency changes how the brain controls electrical signals linked to either pursuing a reward or putting the brakes on the pursuit. The implication is that sugar cravings aren’t just a matter of appetite, but the result of brain changes brought about by habitual exposure to a potently addictive chemical. This is yet more evidence that we’ve been underestimating the effects of sugar for too long. (Another study from the year showed how fructose damages genes underlying memory.)


Finding Genetic Links To Happiness And Depression

One of the largest studies to date seeking genetic links to mood found convincing evidence that how we psychologically experience the world has roots in the genome. More than 190 researchers in 17 countries analyzed genomic data from nearly 300,000 people. The results zeroed in on a handful of genetic variants linked to subjective well-being—the thoughts and feelings we have about the quality of our lives, which psychologists define as a central component of happiness. Other variants were found with links to depression and neuroticism. The next big questions include how these variants interact with our environments, and if depression can be genetically revealed before developing into a full-blown disorder.

First Step Toward A Preventative Alzheimer's Pill

Research in 2016 opened the door to an eventual preventative medication against Alzheimer's, and potentially also other neuro-degenerative diseases like Parkinson's. Scientists from the Baylor College of Medicine, Texas Children’s Hospital and Johns Hopkins University School of Medicine targeted ways of reducing the amount of toxic proteins that accumulate over years in the brains of those who subsequently develop these diseases, specifically the tau protein that's been strongly linked to the development of Alzheimer's. The research is a shift in focus, as most Alzheimer’s studies have concentrated on the later stages of the disease. But in the last several years mounting evidence has pointed to Alzheimer’s developing over the course of decades, which opens the possibility of slowing its progression before irreversible damage is done to a patient’s brain later in life. This study marks a definitive step forward in the treatment of a disease that affects one in every nine people over the age of 65.

How Sleep Apnea Changes The Brain

While it’s difficult to choose a single sleep research study from the year, one in particular stands out to me because it uncovered more precisely the effects of sleep apnea on the brain. Apnea is a growing concern for several reasons, its link to stroke, depression and traffic accidents among them. This study showed how restless nights of interrupted breathing trigger a chemical rollercoaster in the brain by throwing off the neurotransmitters GABA and glutamate. The results, common to apnea sufferers, include a heightened response to stress, lack of concentration and feeling like emotions are teetering on the proverbial cliff. More to come on this as sleep research continues its ascent.


Walking Is Deceptively Simple Brain Medicine

In the practical science category, research reinforced the importance of simply taking a walk for a positive brain boost. Among a stack of studies supporting the argument, one from 2016 focused on how walking improves mood even when we’re not expecting any effect. Researchers conducted three experiments on hundreds of people to find out if they’d experience a positive mood boost while walking, without knowing that walking could be the reason. They found that just 12 minutes of walking resulted in an increase in joviality, vigor, attentiveness and self-confidence versus the same time spent sitting. The importance here is to underscore a basic point: some of the best brain tools available to us don’t require money, special training or seeing a doctor. They just require moving.

Facebook's Effect On How The Brain Manages Relationships

Much of the psych research about Facebook has focused on whether it’s a mood enhancer or depression trigger, and you can find studies from 2016 supporting both arguments. The study I’m more interested in asked whether Facebook is changing how we manage relationships. Theoretically, a social media tool that allows us to expand our reach to thousands of people could enable us to turn a corner, cognitively speaking, and go beyond the constraints that have kept human social groups relatively small for centuries. Or not. Maybe a few decades from now we’ll have a different answer, but for the moment it seems that despite big social media numbers, our brains are still calibrated to handle right around 150 overall relationships, and a much smaller number of close relationships. Dunbar's Number holds.

Old-Time Memory Hacks Are Still The Best

Finally, in the rage-against-the-digital-machine category, I really liked a study from 2016 showing why “reminders through association” (or “cue-based reminders”) work so well. It’s all about simple time and place proximity, according to the researchers, and none of the memory hacks require a computer of any sort to work. Crumpled paper, paperclips and well-placed envelopes do the trick darn near flawlessly. As our lives become more complex and stressful, practical science like this becomes more essential.

Source: This article was published on forbes.com by David DiSalvo


Nobody likes layovers, but the first astronauts heading to Mars will experience one of the longest layovers of their lives: they'll have to spend a year going around the moon before departing for the red planet. It's not all bad news, however, as they won't just be waiting for time to pass. NASA wants to use that year to make sure the round trip to Mars, a 1,000-day endeavor, is carefully rehearsed and planned.


NASA's Greg Williams revealed that the agency's Phase 2 of its plan to send humans to Mars includes a one-year layover in orbit around the moon in the late 2020s, Space reports.

Williams, NASA's deputy associate administrator for policy and plans at the Human Exploration and Operations Mission Directorate, said NASA wants to build a "deep-space gateway" around the moon that would serve as the testing ground for the first Mars missions.

The moon-orbit base would also serve as the staging point for the mission, and the spacecraft that will carry humans to Mars for the first time ever will be launched from the moon.

“If we could conduct a yearlong crewed mission on this Deep Space Transport in cislunar space, we believe we will know enough that we could then send this thing, crewed, on a 1,000-day mission to the Mars system and back,” Williams said.

Considering the length of the Mars trip, spending a year around the moon to make sure everything works correctly makes plenty of sense.


NASA will kick off its Mars mission with Phase 1, between 2018 and 2026. During this time, the agency will send four missions to the moon to deliver various components needed for the mission. Phase 2 will begin in 2027, with an uncrewed mission that would deliver the Deep Space Transport vehicle to cislunar space.

The actual trip to Mars will take place in the 2030s, as shown in the following infographic.

Image Source: NASA/The Humans to Mars Summit

Source: This article was published on BGR News by Chris Smith


Earth is a pretty nifty place. I mean, I’ve spent my entire life here and I’m guessing you have, too, and there’s plenty to see and do, but why is it here at all? For a long time, researchers have tried to answer that question with varying degrees of success, but a new theory of how Earth formed is gaining traction, and it might be the explanation we’ve been looking for.


The most widely accepted explanation for how Earth and most terrestrial planets formed hinges on materials orbiting a newborn star — in this case, our sun — which bunched up and formed planets. It's a fine theory, but some researchers have grown increasingly skeptical that the materials that make up our planet, which is rocky and iron-rich, could have stuck together on their own.

A new idea, introduced by Alexander Hubbard, who holds a Ph.D. in astronomy and now works with the American Museum of Natural History, turns to the sun for an explanation. Hubbard has proposed that the sun went through a period of intense volatility in which it essentially roasted much of the material in its immediate vicinity, stretching as far as Mars. The softened materials would have been the right consistency to bunch up and form planets, which would explain why the rocky worlds of Mercury, Venus, Earth and Mars sprang up.


Hubbard's theory isn't just a random guess; he's basing the idea on the observed behavior of an infant star that went through a phase just like the one he's proposing for our own sun. FU Orionis was first observed rapidly brightening in 1936, and at present it shines over 100 times brighter than it did when originally observed. If our own sun pulled the same trick early in its life, it could have been exactly what was needed to form our planet.

Source: This article was published on bgr.com by Mike Wehner



The UK’s internet habits have changed dramatically over the last decade; according to a recent Ofcom report, 50 million of us are internet users, with the average person now spending 25 hours a week online.

While the internet plays an integral part in our daily lives, many of us have little understanding of how it actually works. This video outlines the very basics of the internet, explaining how the Transmission Control Protocol (TCP) works and how the common structure of a URL is broken down.
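As a concrete illustration of that URL breakdown (the example URL here is made up), Python's standard library can split a URL into its named parts:

```python
from urllib.parse import urlparse

# Split a URL into the components the video describes:
# scheme, host (domain), path and query string.
url = "https://www.telegraph.co.uk/technology/video.html?episode=5"
parts = urlparse(url)

scheme = parts.scheme   # "https"
host = parts.netloc     # "www.telegraph.co.uk"
path = parts.path       # "/technology/video.html"
query = parts.query     # "episode=5"
```

Each segment has a fixed role: the scheme names the protocol, the host identifies the server, and the path and query tell that server which resource to return.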

This is the fifth episode in our How It Works series that discusses the basics of everyday technology.


Future episodes in this series will look at the inner workings of Wi-Fi and a digital camera, while last week’s episode explored the science behind how a touchscreen works.


Source: telegraph.co.uk


PALO ALTO, CA--(Marketwired - Mar 7, 2017) - Bioz, Inc., developers of the world's first search engine for life science experimentation, today launched the next-generation of its patent-pending search engine platform. The updated cloud technology includes a new user interface, extensive coverage of scientific articles, a new vendor partner program, and a new level of quality and accuracy of search results, while also offering deep insights on how to best use life science products in experiments.

"Bioz has quickly disrupted and modernized life science research by structuring scientific knowledge to truly guide researchers when conducting their experiments. Ultimately, this aids in advancing scientific research and speeding up the rate of drug discovery," said Daniel Levitt, co-founder and CEO of Bioz. "Today, over 200,000 biopharma company and university researchers from just about every country in the world rely on our industry-changing cloud-based platform, enabling life science researchers to work faster, smarter and more cost-effectively."


The Bioz search engine taps the latest advances in Natural Language Processing (NLP) and Machine Learning (ML) to mine and structure hundreds of millions of pages of complex and unstructured scientific papers. This places an unprecedented amount of summarized scientific experimentation knowledge at researchers' fingertips.

Bioz Stars, part of the Bioz platform, provides unbiased and objective algorithmic ratings that are displayed for over 200 million life science products, tools, reagents, lab equipment, instruments, assays and kits. Each product is listed with its own algorithmically determined objective Bioz Star rating. The number of Bioz Stars assigned to a product indicates how well that product is likely to perform in a researcher's experiment. The Bioz Star ratings are calculated solely from objective parameters, and are optimized to serve the research community in choosing the right product for their next experiment.

Complementing each product's Bioz Star rating is a detailed set of product usage insights, guiding researchers on how to best use the selected products in their specific experiments. Product usage guidance is provided to users in the form of over one billion structured and objective data points relating to assay protocol conditions, including: dilution, temperature, time and concentration, among many others.

"Bioz empowers the end user to identify, select and evaluate the optimal reagents and tools enabling rapid biomarker assay development. The platform has built-in features for interrogating data for: reagent selection, supplier ratings, tested applications, user feedback, reviews and publications and also health authority guidance," said Dr. Akash Datwani, Scientist and Clinical Development and Therapeutic Area leader of Genentech (Roche). "We are eager to explore the newly released Bioz platform, with its enhanced UI, expanded corpus and assay-specific protocols. These much-needed features have the promise to unveil available tools and reagents to accelerate discovering and delivering better and safer medicines to patients."


The next-generation Bioz platform, available free to researchers and scientists today, includes:

A new user interface: The new platform displays more results in each page, and has a highly intuitive UI and UX focusing on helping researchers quickly find the products they need.

New vendor partner program: The new Bioz vendor program streamlines product procurement for researchers, while also providing them with the latest information on new life science products and services. Vendors will gain access to lead generation and branding opportunities.

A faster "brain" for search quality: Users will benefit from much faster performance via greater sophistication in product synonym and acronym matching across multiple articles.

A smarter "brain" for high-quality search results: Users will also benefit from high-quality search results that are structured using hundreds of sophisticated algorithmic rules that collect, correlate and analyze product data from millions of articles.

New deeper structuring: Product results are now structured such that each product is matched with specific assays and the protocol conditions relevant to those assays. This helps researchers to not only identify the best products to use, but to also decide how to best work with each product in their specific assays. These deep data insights are based on the Bioz platform analyzing and summarizing what has worked for other researchers, as detailed within millions of peer-reviewed life science articles.

More data: An incredible 26 million scientific articles are now available to aid researchers. Moreover, with the new platform, Bioz is updating its article corpus in real-time so that researchers have access to the very latest material.
"We are thrilled to announce our next-generation Bioz platform, offering researchers from around the world an unparalleled resource for advanced objective ratings information and product usage guidance that is based on the source they most trust, scientific articles," said Dr. Karin Lachmi, co-founder, chief scientific officer and president of Bioz. "Bioz is the industry's only life science platform to offer these key information sets, which are necessary for successful experimentation and research. Thus, Bioz' value proposition is focused not only on 'what to buy,' but also on 'how to use it,' which facilitates much faster and better life science research and drug discovery processes that we can all benefit from."

The news follows on the heels of Frost & Sullivan's recent recognition of Bioz, Inc., with its 2017 North American New Product Innovation Award win. The award recognizes the value-added features and benefits and also increased return on investment (ROI) that the Bioz technology offers customers, which in turn raises Bioz customer acquisition and overall market penetration potential.

Source: http://finance.yahoo.com/news/bioz-launches-next-generation-industrys-140000530.html



A researcher in Russia has made more than 48 million journal articles - almost every single peer-reviewed paper ever published - freely available online. And she's now refusing to shut the site down, despite a court injunction and a lawsuit from Elsevier, one of the world's biggest publishers.

For those of you who aren't already using it, the site in question is Sci-Hub, and it's sort of like a Pirate Bay of the science world. It was established in 2011 by neuroscientist Alexandra Elbakyan, who was frustrated that she couldn't afford to access the articles needed for her research, and it's since gone viral, with hundreds of thousands of papers being downloaded daily. But at the end of last year, the site was ordered to be taken down by a New York district court - a ruling that Elbakyan has decided to fight, triggering a debate over who really owns science. 


"Payment of $32 is just insane when you need to skim or read tens or hundreds of these papers to do research. I obtained these papers by pirating them," Elbakyan told Torrent Freak last year."Everyone should have access to knowledge regardless of their income or affiliation. And that’s absolutely legal."

If it sounds like a modern day Robin Hood struggle, that's because it kinda is. But in this story, it's not just the poor who don't have access to scientific papers - journal subscriptions have become so expensive that leading universities such as Harvard and Cornell have admitted they can no longer afford them. Researchers have also taken a stand - with 15,000 scientists vowing to boycott publisher Elsevier in part for its excessive paywall fees.

Don't get us wrong, journal publishers have also done a whole lot of good - they've encouraged better research thanks to peer review, and before the Internet, they were crucial to the dissemination of knowledge.

But in recent years, more and more people are beginning to question whether they're still helping the progress of science. In fact, in some cases, the 'publish or perish' mentality is creating more problems than solutions, with a growing number of predatory publishers now charging researchers to have their work published - often without any proper peer review process or even editing.

"They feel pressured to do this," Elbakyan wrote in an open letter to the New York judge last year. "If a researcher wants to be recognised, make a career - he or she needs to have publications in such journals."

That's where Sci-Hub comes into the picture. The site works in two stages. First of all, when you search for a paper, Sci-Hub tries to download it immediately from fellow pirate database LibGen. If that doesn't work, Sci-Hub is able to bypass journal paywalls thanks to a range of access keys that have been donated by anonymous academics (thank you, science spies).

This means that Sci-Hub can instantly access any paper published by the big guys, including JSTOR, Springer, Sage, and Elsevier, and deliver it to you for free within seconds. The site then automatically sends a copy of that paper to LibGen, to help share the love. 
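
The two-stage lookup described here is a classic cache-first fetch pattern: check a shared archive, fall back to a credentialed fetch, then write the result back so the next request is served from the archive. A minimal sketch of that pattern in Python (the function names and the idea of passing the fetcher in as a callable are illustrative assumptions, not Sci-Hub's actual code):

```python
def fetch_paper(doi, cache, fetch_via_credentials, store):
    """Cache-first lookup: try the shared archive first, fall back to a
    credentialed fetch, and archive the result for the next requester."""
    # Stage 1: check the shared archive (LibGen, in Sci-Hub's case)
    paper = cache.get(doi)
    if paper is not None:
        return paper
    # Stage 2: fetch from the publisher using donated access keys
    paper = fetch_via_credentials(doi)
    # Write a copy back to the archive so future requests hit stage 1
    store(doi, paper)
    return paper
```

The write-back in stage two is what makes the archive grow with every cache miss: each paywalled download immediately becomes a free download for everyone after.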


It's an ingenious system, as Simon Oxenham explains for Big Think:

"In one fell swoop, a network has been created that likely has a greater level of access to science than any individual university, or even government for that matter, anywhere in the world. Sci-Hub represents the sum of countless different universities' institutional access - literally a world of knowledge."

That's all well and good for us users, but understandably, the big publishers are pissed off. Last year, a New York court delivered an injunction against Sci-Hub, making its domain unavailable (something Elbakyan dodged by switching to a new location), and the site is also being sued by Elsevier for "irreparable harm" - a case that experts are predicting will win Elsevier around $750 to $150,000 for each pirated article. Even at the lowest estimations, that would quickly add up to millions in damages.

But Elbakyan is not only standing her ground, she's come out swinging, claiming that it's Elsevier that has the illegal business model.

"I think Elsevier’s business model is itself illegal," she told Torrent Freak,referring to article 27 of the UN Declaration of Human Rights, which states that"everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits".

She also explains that the academic publishing situation is different to the music or film industry, where pirating is ripping off creators. "All papers on their website are written by researchers, and researchers do not receive money from what Elsevier collects. That is very different from the music or movie industry, where creators receive money from each copy sold," she said.

Elbakyan hopes that the lawsuit will set a precedent, and make it very clear to the scientific world either way who owns their ideas.

"If Elsevier manages to shut down our projects or force them into the darknet, that will demonstrate an important idea: that the public does not have the right to knowledge," she said. "We have to win over Elsevier and other publishers and show that what these commercial companies are doing is fundamentally wrong."

To be fair, Elbakyan is somewhat protected by the fact that she's in Russia and doesn't have any US assets, so even if Elsevier wins their lawsuit, it's going to be pretty hard for them to get the money.

Still, it's a bold move, and we're pretty interested to see how this fight turns out - because if there's one thing the world needs more of, it's scientific knowledge. In the meantime, Sci-Hub is still up and accessible for anyone who wants to use it, and Elbakyan has no plans to change that anytime soon.

Author: FIONA MACDONALD

Source: http://www.sciencealert.com/this-woman-has-illegally-uploaded-millions-of-journal-articles-in-an-attempt-to-open-up-science

Categorized in Online Research

(NaturalNews) After witnessing how Reuters just blatantly cooked the presidential election polls this week to favor Clinton and how the mainstream media is so terrifyingly biased in favor of Clinton that the very foundation of democracy is now in crisis, it's time to tell you something that perhaps a lot more people are finally ready to hear:

EVERYTHING IS RIGGED.

Every institution in America is sold out, corrupted and politically rigged to favor Big Government and Big Business. "America is a lost country," explains Paul Craig Roberts. "The total corruption of every public and private institution is complete. Nothing remains but tyranny. And lies. Endless lies."

CNN, Reuters and the Associated Press are all now shameless promoters of every big lie across every sector of society, from vaccines and GMOs to elections and politics. The federal government itself is incapable of doing anything other than lying, and it has totally corrupted the entire realm of science by pulling the strings of funding via the National Institutes of Health and the NSF.


The FDA is entirely corrupt, as is the USDA. Both function now as little more than marketing propaganda pushers for Big Pharma and Big Biotech. Similarly, Google, Facebook and Twitter are all rigged, too, censoring the voices they don't want anyone to hear while highlighting the establishment lies they wish to promote.

Here's what "rigged" really means... the tools of tyranny

When I say "everything is rigged," what does that mean, exactly?

  • All "official sources" are ordered to constantly lie about everything, weaving illusions to push a chosen narrative rooted in fiction (from "there are no Islamic terrorists" to "carbon dioxide is poison to the planet").
  • All voices of reason and sanity are silenced. Only the most insane, irrational voices are allowed to be magnified through any media (including social media). This is also true across the sciences, where real science has been all but snuffed out by political agendas (biosludge, GMOs, glyphosate, mercury in dentistry, etc.).
  • All facts are obliterated by propaganda. Facts have no place in any debate, and those who invoke facts are shamed and silenced (or even fired from their jobs, expelled from their schools or bullied into a state of suicide on social media). Anyone who invokes facts on things like the actual statistics of police shootings is told they are "part of the problem" because they have the "wrong attitude" about social justice.
  • Every branch of government is weaponized against the people and used as an assault tool against political enemies who threaten the status quo. (IRS, FDA, FTC, DEA, EPA, USDA, etc.)
  • All science is distorted into absurd, politically-motivated conclusions about everything the government wants to use to control the masses: Vaccines, climate change, GMOs, fluoride, flu shots, chemical agriculture, carbon dioxide and so on.
  • Every branch of medicine is hijacked by globalist agendas to make sure medicine never makes anyone healthier, more alert or more cognitively capable of thinking for themselves.
  • Every "news item" that's reported from any official source is deliberately distorted to the point of insanity, turning many facts on their heads while attacking anyone who might offer something truly constructive to the world. (Such as reporting that Clinton was "cleared" by the FBI when, in fact, she was indicted by the very facts the FBI presented!)
  • All voices of truth are silenced, then replaced by meaningless, distracting babble (Kardashians) or meaningless, tribal sports competitions (the Rio Olympics). The point is to dumb down the entire population to the point of cultural lunacy.
  • Any true reports that contradict any official narrative are immediately censored. For example, radio host Michael Savage just got blocked by Facebook for posting a true story about an illegal alien who committed murder in America.
  • Emotions are used as weapons to manipulate the masses. For example, when the mom of a Benghazi victim shares her grief with the world, she is ridiculed and shamed. But when a radical Muslim father who's trying to bring Sharia Law to America attacks Trump by expressing his loss of his soldier son, the media turns him into an instant celebrity, praising his "courageous voice" for daring to speak out. The media hypocrisy is enough to make you vomit...

What exactly is rigged?

  • The entire mainstream media
  • Google search engine and Google News
  • Facebook and Twitter
  • The DNC and the RNC (both 100% rigged by globalists)
  • Every federal agency (EPA, FDA, etc.)
  • The entire justice system (makes a total farce of real justice)
  • Interest rates and the value of the money supply (central banksters)
  • Academia (all public universities)
  • EPA's "safe" limits on pesticides (all rigged by Big Biotech)
  • Food and food labeling (all run by corrupt food companies)
  • Public education (rigged into Common Core anti-knowledge idiocy)
  • Banking and finance (all controlled by globalists)
  • Government economics figures and statistics
  • Medicine and pharmaceuticals (rigged to maximize profits)
  • Big Science (totally rigged by government agenda pushers)
  • The music industry (most top singers can't sing at all)
  • Weapons manufacturers and war corporations
  • The illegal drug trade (it's run by the government)
  • Political elections (all 100% rigged at the federal level)
  • Political polls (now rigged by Reuters, too)
  • The health insurance industry (rigged by Obamacare)
  • College admissions (legally discriminates against Whites and Asians)
  • 9/11 and domestic terrorism (all rigged "official stories")
  • Oil and energy industries
  • The rule of law (rigged in favor of the rich and powerful)
  • Infectious disease and the CDC (a constant stream of lies)
  • Hollywood (all run by globalists)
  • Climate change science (all a grand science hoax)
  • Press release services (they only allow official narratives)
  • History (what you are taught is mostly a lie)
  • Government grants (only given out to those who further the agenda)
  • Government bids (only awarded to those who kick back funds to corrupt officials)
  • Consciousness and free will (we are all taught consciousness doesn't exist)
  • Ethnobotany (medicinal and spiritual use of healing plants)
  • Life on other planets (the obvious truth is kept from us all)
  • The origin of the universe (the official narrative is a laughable fairy tale)

As a fantastic example of how everything is rigged, consider these paragraphs from this Breitbart.com news story published today:


Over the weekend and for the past few days since Khan spoke alongside his wife Ghazala Khan about their son, U.S. Army Captain Humayun Khan, who was killed in Iraq in 2004, media-wide reporters, editors, producers, and anchors have tried to lay criticism on Trump over the matter. They thought they had a good one, a specific line of attack that pitted Trump against the military—and supposedly showed him as a big meanie racist in the process.

But, as Breitbart News showed on Monday midday, that clearly was not the case. Khizr Khan has all sorts of financial, legal, and political connections to the Clintons through his old law firm, the mega-D.C. firm Hogan Lovells LLP. That firm did Hillary Clinton’s taxes for years, starting when Khan still worked there involved in, according to his own website, matters “firm wide”—back in 2004. It also has represented, for years, the government of Saudi Arabia in the United States. Saudi Arabia, of course, is a Clinton Foundation donor which—along with the mega-bundlers of thousands upon thousands in political donations to both of Hillary Clinton’s presidential campaigns in 2008 and 2016—plays right into the “Clinton Cash” narrative.

America's transformation into Communist China is nearly complete

If you're pondering where all this is headed, look no further than Communist China, where all independent news has been outlawed by the state. Political prisoners across China have their organs harvested to enrich black market organ traders, and nearly one out of every three urban citizens is a secret spy who snitches on friends for the totalitarian communist government.

Hillary Clinton is the embodiment of Communist Chinese totalitarianism. She's such a perfect fit for their disastrous model of human rights abuses, government corruption and systemic criminality that I'm surprised she doesn't live in Beijing. If Clinton gets elected, America is gone forever, replaced by a criminal regime of totalitarians who violate the RICO Act as a matter of policy.

If this entire rigged system of biased media, Facebook censorship, Google search result manipulations and twisted science ends up putting America's most terrifying political criminal into the White House, it's lights out for the America we once knew. Almost immediately, the nation fractures into near Civil War, with calls for secession growing unstoppable as state after state seeks to escape the political wrath of an insane regime of D.C. criminals and tyrants. #TEXIT


We now live in two Americas: Half the country is tired of everything being rigged, and the other half can't wait to be exploited by yet another crooked leftist LIAR who rigs everything

America is now essentially two nations. On one hand, we have the pro-Trump America, filled with people who are tired of being cheated, censored, punished, stolen from and lied to about everything under the sun. Donald Trump supporters are people who realize everything is rigged... and they're demanding an end to the corruption and criminality of the fascist system under which we all suffer today.

Hillary Clinton supporters are people who are too busy chasing political rainbows to realize everything is rigged. They still believe the lies and the propaganda (the "hope and change" that never came, but is still promised by empty politicians). They're living in fairy tale delusional worlds that have been woven into their gullible minds by the skillful social engineers of the radical left. These people still think the government cares about them... or that CNN only reports truthful news. They can't wait to see another globalist in the White House because they are pathetic, weak-minded empty shells of non-consciousness who are wholly incapable of thinking for themselves.

These two camps of Americans can no longer coexist. They have almost nothing in common when it comes to knowledge, wisdom, ethics, morals or philosophy. One camp believes in the rule of law (Trump); the other camp believes that people in power should be above the law (Clinton). One camp believes in states' rights and individual liberty (Trump) while the other camp believes in the consolidation of totalitarian power in the hands of a centralized, domineering government (Clinton). One camp believes in a level playing field, free market competition and rewarding innovation and hard work (Trump), while the other camp believes in free handouts, government "equality" mandates, and the ludicrous idea that "there should be no winners or losers in society." (Clinton)

In order to try to win this election, the Clinton camp has already rigged EVERYTHING from the very start, including the coronation of Hillary, the scheduling of televised debates to minimize their viewership, the surrender of Bernie Sanders to the DNC machine, the mass organization of illegal voting schemes to make sure illegal aliens vote in November, and so much more. No doubt they're also working extremely hard to rig the black box voting machines all across the country.



If you're tired of everything being rigged, this November vote against the rigged system by voting for Donald Trump. This is truly your last chance to save America from being overthrown by a totalitarian regime of criminals who will crush every last iota of freedom and liberty in America.

Author: Mike Adams

Source: http://www.naturalnews.com/054857_rigged_elections_fake_media_fairy_tales.html

Categorized in Search Engine

Over the past few years we have seen a surge in cyber attacks against well-known organizations, each seemingly larger than the last. As cybercriminals look for innovative ways to penetrate corporate infrastructures, the challenges for brand owners to protect their IP have steadily grown. Fraudsters will stop at nothing to profit from a corporate entity’s security vulnerabilities, and the data they steal can fetch a hefty price in underground online marketplaces.

Whether it is a company with a large customer base that accesses and exchanges financial or personal information online, or a small brand that has IP assets to protect, no company is exempt. While banking and finance organizations are the most obvious targets, an increasing number of attacks are taking place on companies in other industries, from healthcare and retail to technology, manufacturing and insurance companies. Data breaches can have a damaging impact on a company’s internal IT infrastructure, financial assets, business partners and customers, to say nothing of the brand equity and customer trust that companies spend years building.

Battlegrounds: Deep Web and Dark Web

A common analogy for the full internet landscape is that of an iceberg, with the section of the iceberg above water level being the surface web, comprised of visible websites that are indexed by standard search engines. It is what most people use every day to find information, shop and interact online, but it accounts for only about four percent of the Internet.

The remaining sites are found in the Deep Web, which includes pages that are unindexed by search engines. A large proportion of this content is legitimate, including corporate intranets or academic resources residing behind a firewall.

However, some sites in the Deep Web also contain potentially illegitimate or suspicious content, such as phishing sites that collect user credentials, sites that disseminate malware that deliberately try to hide their existence, websites and marketplaces that sell counterfeit goods, and peer-to-peer sites where piracy often takes place. Consumers may unknowingly stumble upon these and are at risk of unwittingly releasing personal information or credentials to fraudulent entities.


Deeper still is the Dark Web, a collection of websites and content that exist on overlay networks whose IP addresses are completely hidden and require anonymizer software, such as Tor, to access. While there are a number of legitimate users of Tor, such as privacy advocates, journalists and law enforcement agencies, its anonymity also makes it an ideal foundation for illicit activity. Vast quantities of private information, such as log-in credentials, banking and credit card information, are peddled with impunity on underground marketplaces in the Dark Web.

Waking up to the Threats

The Deep Web and Dark Web have been in the public eye for some time, but in recent years, fraudsters and cybercriminals have been honing their tactics in these hidden channels to strike at their prey more effectively and minimize their own risk of being caught. The anonymity in the Dark Web allows this medium to thrive as a haven for cybercriminals, where corporate network login credentials can be bought and sold to the highest bidder, opening the door to a cyberattack that most companies are unable to detect or prevent.

While Deep Web sites are not indexed, consumers may still stumble upon them, unaware they have been redirected to an illegitimate site. The paths to these sites are many: typosquatted pages with names that are close matches to legitimate brands; search engine ads for keywords that resolve to Deep Web sites; email messages with phishing links; or even mobile apps that redirect.
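
Typosquatted domains of the kind listed above are routinely flagged by measuring how close a candidate name is to a brand's real domain. A minimal sketch of that idea using Levenshtein edit distance (the threshold and the example domains are illustrative assumptions, not any vendor's actual tooling):

```python
def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def flag_typosquats(candidates, brand_domain, max_distance=2):
    """Return candidate domains within a small edit distance of the brand
    (excluding exact matches, which are the brand's own domain)."""
    return [d for d in candidates
            if 0 < edit_distance(d, brand_domain) <= max_distance]
```

A distance of 1 or 2 catches the common tricks - swapped letters, 'l' replaced with '1', 'rn' standing in for 'm' - while leaving unrelated domains alone; production tools typically add homoglyph tables and keyboard-adjacency weighting on top of this.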

Moreover, as more users learn the intricacies of Tor to access and navigate the Dark Web, the scale of anonymity grows. More points in the Dark Web’s distributed network of relays make it more difficult to identify a single user and track down cybercriminals. It’s like trying to find a needle in a haystack when the haystack continues to get larger and larger.


The Science and Strategy Behind Protection

Brands can potentially mitigate abuse in the Deep Web, depending on the site. If a website attempts to hide its identity from a search engine, there are technological solutions to uncover and address the abuse. Conventional tools commonly used by companies to protect their brands can also tackle fraudulent activity in the Deep Web, including takedown requests to ISPs, cease and desist notices and, if required, the Uniform Domain-Name Dispute-Resolution Policy (UDRP).

As for the Dark Web, where anonymity reigns and the illicit buying and selling of proprietary and personal information are commonplace, companies can arm themselves with the right technology and threat intelligence to gain visibility into imminent threats. Actively monitoring fraudster-to-fraudster social media conversations, for example, enables companies to take necessary security precautions prior to a cyberattack, or to prevent or lessen the impact of a future attack. In the event of a data breach where credit card numbers are stolen, threat intelligence can help limit the financial damage to consumers by revealing stolen numbers before they can be used, so the bank can cancel them.
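
The card-matching step described above boils down to intersecting a leak feed with the bank's own issued numbers; matching on salted hashes rather than raw numbers keeps card data out of the monitoring system. A minimal sketch (the function names, feed format and salt handling are illustrative assumptions):

```python
import hashlib

def card_fingerprint(pan: str, salt: bytes) -> str:
    """Salted SHA-256 fingerprint so raw card numbers never leave the bank."""
    return hashlib.sha256(salt + pan.encode()).hexdigest()

def cards_to_cancel(issued_pans, leaked_fingerprints, salt):
    """Return issued card numbers whose fingerprints appear in a leak feed."""
    leaked = set(leaked_fingerprints)
    return [pan for pan in issued_pans
            if card_fingerprint(pan, salt) in leaked]
```

In practice the leak feed would come from Dark Web monitoring, and the bank would cancel and reissue every card this returns before the numbers can be used.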

Technology can even help identify and efficiently infiltrate cybercriminal networks in the Dark Web that might otherwise take a considerable amount of manual human effort by a security analyst team. Access to technology can significantly lighten the load for security teams and anchor a more reliable and scalable security strategy.

In light of so many cyber threats, it falls to organizations and their security operations teams to leverage technology to identify criminal activity and limit financial liability to the company and irreparable damage to the brand.

Key Industries at Risk

A growing number of industries are now being targeted by cybercriminals, but there are tangible steps companies can take. For financial institutions, visibility into Dark Web activity yields important benefits. Clues to an impending attack might be uncovered in time to save millions of dollars and stop the erosion of customer trust. Improved visibility can also help companies identify a person sharing insider or proprietary information and determine the right course of action to reduce the damage.

In the healthcare industry, data breaches can be especially alarming because they expose not only the healthcare organization’s proprietary data, but also a vast number of people’s medical information and associated personal information. This could include images of authorized signatures, email addresses, billing addresses and account numbers. Cybercriminals who use information like this can exploit it to compromise more data, such as social security numbers and private medical records. Credentials could even potentially lead to identities being sold.

Conclusion

Most organizations have implemented stringent security protocols to safeguard their IT infrastructure, but conventional security measures don’t provide the critical intelligence needed to analyze cyberattacks that propagate in the Deep Web and Dark Web. It is fundamentally harder to navigate a medium where web pages are unindexed and anonymity can hide criminal activity.

Meanwhile, cyberattacks on organizations across a wider number of sectors continue to surge, putting proprietary corporate information, trade secrets and employee network access credentials at risk. Businesses need to be aware of all threats to their IP in all areas of the Internet. Leveraging every available tool to monitor, detect and take action where possible is vital in addressing the threats that these hidden regions of the internet pose.

Author: Charlie Abrahams

Source: http://www.ipwatchdog.com/2016/12/14/brand-protection-deep-dark-web/id=75478

Categorized in Deep Web

In 2011, the Finnish Tourist Board ran a campaign that used silence as a marketing ‘product’. They sought to entice people to visit Finland and experience the beauty of this silent land. They released a series of photographs of single figures in nature and used the slogan “Silence, Please”. A tag line was added by Simon Anholt, an international country branding consultant: “No talking, but action.”

Eva Kiviranta, the manager of social media for VisitFinland.com, said: “We decided, instead of saying that it’s really empty and really quiet and nobody is talking about anything here, let’s embrace it and make it a good thing”.

Finland may be on to something very big. You could be seeing the very beginnings of using silence as a selling point, as silence may be becoming more and more attractive. As the world around becomes increasingly loud and cluttered, you may find yourself seeking out the reprieve that silent places and silence have to offer. This may be a wise move, as studies are showing that silence is much more important to your brain than you might think.


Regenerated brain cells may be just a matter of silence.


A 2013 study on mice published in the journal Brain Structure and Function used different types of noise and silence and monitored the effect the sound and silence had on the brains of the mice. The silence was intended to be the control in the study, but what they found was surprising. The scientists discovered that when the mice were exposed to two hours of silence per day, they developed new cells in the hippocampus. The hippocampus is a region of the brain associated with memory, emotion and learning.

The growth of new cells in the brain does not necessarily translate to tangible health benefits. However, in this instance, researcher Imke Kirste says that the cells appeared to become functioning neurons.

“We saw that silence is really helping the new generated cells to differentiate into neurons, and integrate into the system.”

In this sense silence can quite literally grow your brain.

The brain is actively internalizing and evaluating information during silence


A 2001 study defined a “default mode” of brain function that showed that even when the brain was “resting”, it was perpetually active, internalizing and evaluating information.

Follow-up research found that the default mode is also used during the process of self-reflection. In 2013, in Frontiers in Human Neuroscience, Joseph Moran et al. wrote, the brain’s default mode network “is observed most closely during the psychological task of reflecting on one’s personalities and characteristics (self-reflection), rather than during self-recognition, thinking of the self-concept, or thinking about self-esteem, for example.”


When the brain rests it is able to integrate internal and external information into “a conscious workspace,” said Moran and colleagues.

When you are not distracted by noise or goal-orientated tasks, there appears to be a quiet time that allows your conscious workspace to process things. During these periods of silence, your brain has the freedom it needs to discover its place in your internal and external world.

The default mode helps you think about profound things in an imaginative way.

As Herman Melville once wrote, “All profound things and emotions of things are preceded and attended by silence.”

Silence relieves stress and tension.


It has been found that noise can have a pronounced physical effect on our brains, resulting in elevated levels of stress hormones. The sound waves reach the brain as electrical signals via the ear. The body reacts to these signals even if it is sleeping. It is thought that the amygdalae (located in the temporal lobes of the brain), which are associated with memory formation and emotion, are activated, and this causes a release of stress hormones. If you live in a consistently noisy environment, you are likely to experience chronically elevated levels of stress hormones.


A study that was published in 2002 in Psychological Science (Vol. 13, No. 9) examined the effects that the relocation of Munich’s airport had on children’s health and cognition. Gary W. Evans, a professor of human ecology at Cornell University, notes that children who are exposed to noise develop a stress response that causes them to ignore the noise. What is of interest is that these children not only ignored harmful stimuli, they also ignored stimuli that they should be paying attention to, such as speech.

“This study is among the strongest, probably the most definitive proof that noise – even at levels that do not produce any hearing damage – causes stress and is harmful to humans,” Evans says.

Silence seems to have the opposite effect on the brain to noise. While noise may cause stress and tension, silence releases tension in the brain and body. A study published in the journal Heart discovered that two minutes of silence can prove to be even more relaxing than listening to “relaxing” music. They based these findings on changes they noticed in blood pressure and blood circulation in the brain.

Silence replenishes our cognitive resources.

049da49ea55fb677185adba10795f01f

The effect that noise pollution can have on cognitive task performance has been extensively studied. It has been found that noise harms task performance at work and school. It can also be the cause of decreased motivation and an increase in error making. The cognitive functions most strongly affected by noise are reading, attention, memory and problem solving.


Studies have also concluded that children in households or classrooms near airplane flight paths, railways or highways have lower reading scores and are slower in their development of cognitive and language skills.

But it is not all bad news. It is possible for the brain to restore its finite cognitive resources. According to attention restoration theory, when you are in an environment with lower levels of sensory input, the brain can ‘recover’ some of its cognitive abilities. In silence the brain is able to let down its sensory guard and restore some of what has been ‘lost’ through excess noise.

Summation

Traveling to Finland may well be on your list of things to do. There you may find the silence you need to help your brain. Or, if Finland is a bit out of reach for now, you could simply take a quiet walk in a peaceful place in your neighborhood. This might prove to do you and your brain a world of good.

Source: lifehack.org

Categorized in Others