Showing posts with label ars technica. Show all posts

Friday, March 3, 2017

Superbugs can migrate up pipes!

Via Ars Technica, research indicates that splashy sinks can aid in the transmission of superbugs in healthcare facilities.

It started with research into an outbreak in Canada that infected 36 people and killed 12. Investigators knew the bacteria came from the sinks but could not figure out how; no amount of cleaning or disinfection seemed to wipe them out.

Now, with a new study published in Applied and Environmental Microbiology, researchers may finally have an answer to superbugs’ sink-dwelling skills: They survive in P-traps and can quickly climb pipes. More specifically, researchers at the University of Virginia found that bacteria can happily colonize a sink’s P-trap and then sneak back up the pipe and into the drain by forming a protective, creeping film, called a biofilm, on the plumbing. Once they get to the drain, they only need a burst of water to scatter up into the sink and surrounding, touchable surfaces.

It will be interesting to see how this type of analysis changes sink design moving forward.



Here is the link to the full article:


Wednesday, July 1, 2015

You've got to know when to hold 'em,

Original post:  Jan 14, 2015

Know when to fold 'em,


"The Gambler" by Kenny Rogers is a famous song built around a simplified poker strategy. The rules of the game are fairly straightforward: every player tries to assemble the best possible hand as defined by the variant being played. Yet with 52 cards, there is a staggering number of potential variations, and they change with every hand.

In an effort to stretch their limits, computer programmers have "solved" many games. Ars Technica explains this phenomenon further:

Computers have made remarkable progress when it comes to beating their programmers at a number of games. Checkers, Chess, and Jeopardy have all seen their champions fall to silicon opponents. In fact, for two games—checkers and Connect Four—computers have calculated the optimal move for every single possible board combination. The best a human can hope for is a draw.

But the games computers have done well with are what are called "perfect information games," where both players have full access to all the knowledge there is to have about the game. (Think of it this way: by looking at a chess board, you know precisely where every piece on the board is.) Lots of games that attract players have imperfect information: players know things their opponents don't. The classic examples here are card games, where some fraction of the cards is typically known only to the player holding them.
It's not possible to solve these in the same sense; you can't know the ideal path forward from a given state because you simply don't fully know what the state is. But it is possible to figure out strategies that make it very difficult for an opponent to exploit them. And, for a specific form of poker, researchers have now done so. Given a set of strategies, no human is likely to come out ahead while playing a computer.

There have been a number of triumphant headlines declaring that computers can now defeat humans at poker. The programming effort was intensive and impressive:

This creates a serious computation challenge, as heads-up limit Texas hold'em needs 3.19 × 10^14 individual pieces of data to store the combination of strategies and regret values. The authors estimate that alone would take up over 260 TB of disk space. To cut this down, the authors multiplied all of these floating point values by a scaling factor (to make them larger numbers) and then converted them to integers. They then devised a storage scheme that kept these values stored in a manner that's easily compressed. In the end, this cut things down to 17 TB of storage.
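The scale-then-quantize trick described above can be sketched in a few lines of Python. This is a toy illustration, not the authors' actual code; the regret values and the scaling factor are made up:

```python
import struct
import zlib

# Toy regret values in [0, 1); the real CFR tables hold ~3.19e14 of these.
regrets = [0.0, 0.1234567, 0.1234568, 0.9999999, 0.0, 0.0]

SCALE = 10_000_000  # hypothetical scaling factor chosen to preserve precision

# Multiply by the scaling factor and round, turning floats into integers.
quantized = [round(r * SCALE) for r in regrets]

# Both encodings use 8 bytes per value, but integers with repeats and
# shared high-order bytes typically compress much better than raw floats.
as_floats = struct.pack(f"{len(regrets)}d", *regrets)
as_ints = struct.pack(f"{len(quantized)}q", *quantized)

print(len(zlib.compress(as_floats)), len(zlib.compress(as_ints)))
```

The real system paired this quantization with a purpose-built storage layout; the point here is only that integerized values give a generic compressor much more redundancy to exploit than IEEE floats do.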

The program then has to iterate through all possible strategies and assign them regret values. To speed the process of identifying winning approaches, any strategy that had never been chosen before but resulted in a win was immediately tried again. Beyond that, strategies are chosen with a probability proportional to their regret values. The authors also simplified computations by dividing up the huge number of possibilities into a bit over 100,000 subgames based on things like the initial bets and the flop (first face-up cards).
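The "probability proportional to regret values" rule mentioned above is known as regret matching. A minimal sketch, with hypothetical regret values (the actual solver used a refined variant, CFR+):

```python
def regret_matching(regrets):
    """Pick each action with probability proportional to its positive regret.
    If no action has positive regret, fall back to a uniform distribution."""
    positives = [max(r, 0.0) for r in regrets]
    total = sum(positives)
    if total == 0.0:
        n = len(regrets)
        return [1.0 / n] * n
    return [p / total for p in positives]

# Hypothetical regrets for three actions: fold, call, raise.
strategy = regret_matching([-2.0, 3.0, 1.0])
print(strategy)  # → [0.0, 0.75, 0.25]: never fold, call 75%, raise 25%
```

Actions that have performed badly (negative regret) are dropped entirely, while the remaining probability mass is split in proportion to how much the player "regrets" not having chosen each action more often.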

Even with all these optimizations, it took a cluster of 200 2.1GHz AMD cores (each with 32GB of RAM) 900 core-years to reach a point where improvements in outcomes slowed down. At that point, the strategies were optimized such that a person could play it for 70 years, 12 hours a day, using its own optimized strategy, and it would be overwhelmingly unlikely for them to end up ahead by a statistically significant margin. The authors call this situation "weakly solved."

While this is an amazing achievement, it also shows how staggeringly complex card games involving imperfect information can be. The version of poker the computer "weakly solved" involves only two players at a time, in the simplest form of Texas hold'em, where bets are limited on each round. The poker typically shown on TV involves up to nine players at a time with unlimited bets on every round, adding layers of complexity well beyond the current program's capabilities.

The Economist wrote another article that sums this up best:

Whether computers will ever be able to solve other forms of poker remains doubtful. Merely removing the betting restrictions on HULHE, for instance, boosts the range of possibilities to 6.38 × 10^161, a figure so mind-bogglingly big that it far exceeds the number of subatomic particles in the observable universe. No amount of improvement in computer hardware will ever make such a problem tractable. The only hope is an enormous, and unlikely, conceptual breakthrough in how to attack the question.
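That comparison is easy to sanity-check: common estimates put the particle count of the observable universe around 10^80, so the no-limit game tree exceeds it by roughly eighty orders of magnitude:

```python
from math import log10

game_states = 6.38e161  # no-limit variant, per the article
particles = 1e80        # rough common estimate for the observable universe

# How many orders of magnitude larger is the game tree than the universe?
print(round(log10(game_states) - log10(particles)))  # → 82
```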



There are, of course, poker-playing programs out there already that play more complicated versions of the game than HULHE. The best are better than most humans. But they, like chess-playing programs, do not actually solve the game in a mathematically rigorous sense. They just process more data than a human brain can cope with, and thus arrive at a better answer than most such brains can manage.


The most interesting computational solution to poker, though, would be one that did work more like a human brain, for instance by looking for the famous “tells” that experienced players claim give away their opponent’s state of mind, or even bluffing those opponents about its own intentions. When computers can do that, mere humans—and not just poker players—should really start worrying.

Here is the link to the Economist:  Computer poker: The perfect card sharp | The Economist

Tuesday, June 16, 2015

I want my USB!

Original post:  Aug 21, 2014

Ars Technica recently posted an article on the history of USB. The Universal Serial Bus changed the way computers connect, allowing (relatively) seamless integration between devices (think of how much simpler it is to connect a printer than it used to be). It has also made file transfers and quick backups almost painless.
[Image: usb.gif — the familiar struggle of plugging in a USB connector right-side up]
The newest version, called Type-C, is expected soon. It will eliminate the problem shown above, since the connector works no matter which side is facing up.
Here are some of the various styles of ports that USB replaced.
[Image: serial.jpg — the legacy ports that USB replaced]
Many of us have never had to deal with the frustration of having the wrong cable for a peripheral, or struggled to figure out which connector went to the monitor and which to the printer. That's thanks to the foresighted individuals who standardized the interfaces and made USB successful.

What will the next version of USB look like?

Sunday, June 14, 2015

The most annoying problem in computing

Original post:  Jul 8, 2014

Smartphones are amazing. With their internet connections and hundreds of thousands of apps, they possess some incredible capabilities. Unfortunately, they all share one common weakness: their need for battery power. Every year, manufacturers replace the current generation of chips with even more powerful ones. While this lets phones do much more, it also increases the demand for power. Coupled with our desire for slim styling, it's no surprise that this is a common sight on many phones:
[Image: a smartphone's low-battery warning]
Because of the relentless demand for power, our perception is that new phones remain shackled to outdated battery technology. The truth is that batteries have improved tremendously over the years. The chart below shows some of that progress, particularly after Sony commercialized lithium-ion technology in 1991. Batteries just haven't been able to keep up with the incredible advance of Moore's law.
[Image: chart of battery energy density improving over time]
There are only two ways to improve the situation: modify the software to use the available power more effectively, or change the type of battery you use.

Google is trying the first kind of solution. Android L, its new mobile operating system, includes a battery conservation feature known as "Project Volta." Among the company's most disturbing findings was that waking up your smartphone for one second causes two minutes' worth of battery loss, due to the number of tasks it carries out automatically each time the screen is activated. Project Volta alleviates this drain with a "JobScheduler" function that allows apps to perform background tasks only when the device is plugged in. Project Volta also includes several features that allow app developers and users to figure out which tasks are using the most battery power, and the operating system itself is set up for maximum energy efficiency. Overall, Ars Technica found that Project Volta increased the average Android phone's battery life by 36 percent.

But there's only so much Google, or any other software manufacturer, can do to fix the battery life problem. Energy inefficiency is everywhere – not just in the operating systems we use, but in the antennas that connect our phones to carrier networks, in the apps we run, and, actually, in the internet itself....
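The JobScheduler idea described above — defer non-urgent background work until the device is charging, so the phone wakes up once instead of dozens of times — can be sketched generically. This is a toy model in Python, not the actual Android API:

```python
from dataclasses import dataclass, field

@dataclass
class ToyScheduler:
    """Toy model of deferring battery-hungry jobs until the device charges."""
    charging: bool = False
    pending: list = field(default_factory=list)
    ran: list = field(default_factory=list)

    def submit(self, job, deferrable=True):
        # Deferrable jobs wait for power; urgent ones run immediately.
        if deferrable and not self.charging:
            self.pending.append(job)
        else:
            self.ran.append(job)

    def plug_in(self):
        # Once plugged in, flush the queued work in one batch, avoiding
        # many separate (expensive) device wake-ups.
        self.charging = True
        self.ran.extend(self.pending)
        self.pending.clear()

s = ToyScheduler()
s.submit("sync photos")                       # deferred while on battery
s.submit("incoming call", deferrable=False)   # runs immediately
s.plug_in()                                   # "sync photos" runs in the batch
```

The battery win comes from batching: since each wake-up carries a large fixed cost (the "one second of use, two minutes of battery" finding), running ten queued jobs in one wake-up is far cheaper than ten separate wake-ups.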

The second approach to battery improvement is to switch the type of battery altogether. Researchers are working on this, too, under the assumption that only substantial technical advances will truly give us the all-day (or even multi-day or multi-week) battery life we yearn for, while keeping our devices as slim as we want them.
One of the more promising new battery technologies is graphene, a strong, thin, conductive material that is formed by bonding together a single layer of carbon atoms. Last year, when researchers at Vanderbilt covered a silicon supercapacitor in graphene, they found that it was possible to make a kind of super-battery that would recharge in seconds, and last for weeks before needing to be plugged in. These supercapacitors aren't ready to be put into iPhones yet – engineers are still trying to make them smaller and denser – but they're out there on the horizon.
[...]
Another, more immediately feasible option is a new kind of lithium-ion battery, developed by USC researchers and expected to hit the market in 2015 or 2016, that uses silicon nanotubes instead of the graphite found in traditional batteries. The result is a battery that has three times as much capacity as a normal lithium-ion, and charges in just ten minutes. One Silicon Valley start-up, Amprius, has raised $30 million to mass-produce a similar kind of battery for testing by companies like Nokia.

Here is the link to the full article:  The Most Annoying Problem in Computing -- NYMag

The law of unintended consequences

Original post:  May 15, 2014

Sometimes our best efforts lead us in directions we never wanted to travel. Some of the most dangerous threats to our health are caused by the very things we hoped would shield us from them.

This article from Ars Technica discusses one such situation. It reviews a new book by Martin Blaser titled "Missing Microbes: How the Overuse of Antibiotics is Fueling Our Modern Plagues."

While antibiotics have helped us tremendously, the author cautions that these same wonder drugs are also disrupting the complicated nature of the human biome. In addition to killing the harmful bacteria, we may also be wiping out the helpful ones as well.

I have a personal experience that underscores this theory. One of the worst experiences in my life was undergoing treatment for a stomach ailment while in the Army. They treated me with large doses of antibiotics that wreaked havoc on my digestive system. I think it took me months to get back to feeling normal. Perhaps it was because the antibiotics killed all of the normal flora that my body had cultivated over the years to protect me!

Here is a key excerpt from the article:

The hidden cost of our profligate antibiotic overuse is a prominent theme in this book. Blaser started his career as an infectious disease guy and he did his time at the Centers for Disease Control; he knows full well the damage pathogens can wreak, and just how essential antibiotics are to maintaining public health. This is another reason he advocates curbing antibiotic overuse: to reduce antibiotic resistance, so these drugs will still work when we really need them.

By treating every childhood sniffle with broad spectrum antibiotics, we are killing not only dangerous pathogens but also the hundreds of trillions of bacteria that live in our bodies and have been there ever since the dawn of humanity. He thinks that those species are there for a reason, and he warns that their demise is causing big trouble—with more to come.

Ironically, antibiotic exposure has also been shown to increase susceptibility to subsequent infection, possibly because of the effects that perturbing the microbiome has on immunity.

The author thinks that one of the most important reasons to minimize the use of antibiotics is to preserve bacterial diversity:

Our bodies have ten times more bacterial cells than human cells. There might be trillions of cells of some species—like Bacteroides—but only a few hundred cells of rarer species. As in any ecosystem, biodiversity provides robustness.