Saturday, August 28, 2010

Cool laser trick

Do you have pond water and a green laser pointer? If so, you should replicate this guy's awesome setup. All he did was hang a drop and shine a laser at it, and voila! It reminds me of those planetarium shows I saw as a kid, but with bacteria.


Wednesday, August 25, 2010

Fraser river sockeye: A failure of science?

Since 2006, commercial fishing for sockeye salmon on the Fraser River in British Columbia has been closed. In 2009 the run collapsed, with a return of 1.5 million fish despite a prediction of 11 million. This instigated an expensive government inquiry to find out what happened. The end of Fraser River wild sockeye salmon seemed quite likely. Federal regulation completely failed in eastern Canada with the cod fishery, and in 2009 it seemed as if it had happened once again in western Canada.
As a sport fisherman, I am always under the strong impression that fishing is getting worse, year after year. Old photos and fishing stories of huge and abundant salmon seem to reinforce that impression.
Now the 2010 sockeye salmon run is a record, with an estimated 25 million fish returning. This is the largest return in almost a century! Good news for salmon, fishermen and the environment, but it is also an indication of really inaccurate science. An estimate in late July suggested there might be 11 million returning sockeye, far from reality. At this point it seems clear that the Department of Fisheries and Oceans is incapable of making accurate estimates. I am not suggesting that estimating salmon returns is easy, but making policy decisions based upon these predictions seems like a bad idea. Is there a very important variable that is not being accounted for?
How is it possible that the estimates can be so wrong? A brief look at the pre-season estimate report for 2010 shows that estimates are based heavily on retrospective data. Perhaps this data is insufficient for estimating returns, and other sources need to be utilized. I hope this results in some changes to the methods used to make these estimates. Even better, the government could fund more ecological studies of this important commercial resource, in order to better understand how an error of this magnitude could have happened.


Tuesday, August 17, 2010

Dig Deeper

Christina Pikas has a post up about the danger of using only sources with recent coverage. That is, not digging far enough back in a literature search. She points to an editorial from Nature Reviews Microbiology that says youngsters today aren't getting the proper baseline literature because they're relying on PubMed and Google Scholar. The editorial cites the subject area of bacteriophage biology, which developed well before the Medline era. Some researchers in this area have created their own bibliography of pre-PubMed articles, but they are concerned about losing access to the publications as they are moved out of the library to storage.
One solution is to make students aware of other databases and resources (your university library likely has access to plenty of them) and encourage them to use librarians as a resource. At our grad school, the first departmental seminar of the year is given by the library staff, though I think it tends not to be taken seriously. An even worse problem is the attitude that if it can't be accessed online - if you have to go to the library and find it on the shelves, or deal with interlibrary loans - then it isn't worth the trouble.

Christina points out a very real danger of ignoring older literature: the case of a Johns Hopkins researcher whose 'current only' search missed an association between the intervention and lung toxicity, leading to the death of a volunteer. This is an extreme case, but there is a real danger of missing important findings in your field or duplicating previous work. Think about the time and effort that could be saved by digging a little deeper!


PLoS Journals Open to Everyone - Except Tobacco Researchers

"At PLoS, we believe that articles in all journals should be assessed on their own merits rather than on the basis of the journal in which they were published."
Assessed on their own merits, that is, unless they were funded in whole or in part by a tobacco company. A few months ago, PLoS Medicine followed PLoS Biology and PLoS ONE in changing its editorial policy to one of no longer considering research where support comes from a tobacco company.

This change comes mainly on the grounds that tobacco is indisputably bad for health; the editors see tobacco-sponsored research articles as advertising, and they refuse to help enhance the image of the industry. They also have concerns about industry ethics.
we remain concerned about the industry's long-standing attempts to distort the science of and deflect attention away from the harmful effects of smoking. That the tobacco industry has behaved disreputably—denying the harms of its products, campaigning against smoking bans, marketing to young people, and hiring public relations firms, consultants, and front groups to enhance the public credibility of their work—is well documented. There is no reason to believe that these direct assaults on human health will not continue, and we do not wish to provide a forum for companies' attempts to manipulate the science on tobacco's harms.
There is no doubt that smoking is unhealthy and that tobacco companies have acted in dubious ways to support their business interests. Sadly, that's not uncommon in big business - whether it's tobacco, oil, pharmaceutical or junk food. Will PLoS journals be rejecting all papers coming from Merck?

The issue here is that it runs counter to openness and the idea that research should be judged on its merits. Maybe I'm naive in thinking that "Smoking is Cool!", a study funded by Philip Morris, will raise red flags for everybody. Isn't that why authors declare competing interests?

Strangely, PLoS justifies the decision by arguing that it doesn't happen much anyway.
It is the case that we do not receive many tobacco industry sponsored papers—PLoS Medicine has published none since our inception in 2004 and PLoS ONE only two—and we have made previous editorial judgments on papers that might be favorable to the tobacco industry agenda on a case-by-case basis.
This seems to undercut the argument that Big Tobacco is persistently trying to manipulate science, and by refusing to review the few that are submitted, doesn't that just push them to venues where they might receive a less critical eye?

Worse, this policy could have an effect on tobacco's legitimate contributions to science. How might a ban on tobacco-funded research affect studies exploring plant-based vaccines, agricultural research or even virology (e.g. tobacco mosaic virus research or interferon production)?

I love the free and open nature of the PLoS journals and have no love for Big Tobacco, but I'm not a huge fan of this editorial decision.


Saturday, August 14, 2010

3D Blogging

3D is all the rage now, with many major studio releases coming out in 3D and 3D TVs starting to make their way into homes (even though some people, like filmmaker Christopher Nolan, dislike the technology). Normally the 3D effect is achieved by wearing special eyewear, like the iconic blue/red lenses, to deliver a different image to each eye. Newer devices, like the Nintendo 3DS, use different, eyewear-free technology for the same effect. This video, grabbed from Joystiq, explains how it works:

You'll notice in the video, mention of another application for the same kind of technology: delivering different images to different viewers on the same screen. This could have some interesting uses - no more conflicts of TV scheduling (though unless directional audio is also part of the package, we'll still have to resort to headphones). And imagine local multiplayer gaming, without the annoying split-screen! Sony has, and has patents on a multiplayer stereoscopic system.


Thursday, August 12, 2010

Beer Stats!

It's extremely unmanly of me, but I don't really find sports statistics interesting. Sports stats are a common default male conversation to which I usually have nothing to contribute. However, I do enjoy watching sports (and by sports I mean hockey), provided I am consuming beer. Next time I'm at the pub enjoying a nice India Pale Ale and someone starts on about Crosby vs Ovechkin, giving me an earful of numbers, I think I'll try to switch the conversation to numbers that really matter: beer stats. Your pint vs mine.


Alcohol by volume
To many drinkers this is the only number that matters: the higher the number, the less you need to consume to feel the effects of the ethanol produced by fermentation. Alcohol by volume (abv) is a worldwide standard and is usually the only beer stat you can find on the container. It is simply the amount of ethanol expressed as a percentage of total volume. Conventionally brewed beer, using a conventional yeast and a single fermentation step, is in the 2% to 12% abv range (almost always 4-6%). Higher alcohol content can be achieved using alcohol-tolerant yeast strains, adding sugar during fermentation, or by the fractional freezing that produces so-called ice beers. Fractional freezing involves cooling the fermenting liquid until ice crystals form and float to the surface, where they are removed. Because the ice is mostly water, this increases the ratio of ethanol to water and therefore the abv. If you really want the Sidney Crosby of beer abv you could try an $800 bottle of The End of History by Brew Dog. Maximum Ice (7.5% abv) drinkers will be impressed with The End of History's 55% abv.
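The effect of fractional freezing on abv is easy to estimate. This is my own back-of-the-envelope sketch (not a brewing-industry formula), assuming the ice removed is pure water and no ethanol is lost:

```python
def abv_after_freezing(abv_initial, fraction_removed):
    """Estimate abv after fractional freezing, assuming the removed ice
    is pure water: the ethanol stays while the total volume shrinks."""
    if not 0 <= fraction_removed < 1:
        raise ValueError("fraction_removed must be in [0, 1)")
    return abv_initial / (1 - fraction_removed)

# Freezing off half the volume of a 7.5% beer would roughly double it:
print(round(abv_after_freezing(7.5, 0.5), 1))  # 15.0
```

In practice some ethanol gets trapped in the ice, so real ice beers gain less than this idealized estimate suggests.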

Degrees Plato
The coolest-sounding statistic, degrees Plato, unfortunately isn't very useful when comparing the pints you are drinking; it is mostly used during the brewing process to predict the final abv. Degrees Plato is the sugar content as %w/w, i.e. the grams of sugar per 100 g of wort. Since sugars are consumed and converted to ethanol during fermentation, and are thus no longer what was originally put into the wort, degrees Plato is usually determined indirectly from measurements of specific gravity: the ratio of the density of the wort or beer to the density of water. Specific gravity is used instead of direct density measurements because, at a standard temperature and pressure, it is easily measured with an instrument such as a hydrometer. The difference in specific gravity between the starting wort and the finished beer can be used to calculate the ethanol produced during fermentation, since decreasing dissolved sugar and increasing alcohol both lower the specific gravity.
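As a sketch of the arithmetic, here are two common homebrew approximations: a linear specific-gravity-to-Plato conversion and the abv = (OG - FG) x 131.25 rule of thumb. Both are rough (more accurate polynomial fits exist), and the example gravities are made up:

```python
def sg_to_plato(sg):
    """Rough linear conversion from specific gravity to degrees Plato;
    cubic fits are more accurate, but this is fine for beer-strength worts."""
    return (sg - 1) * 1000 / 4

def estimate_abv(original_gravity, final_gravity):
    """Common homebrew approximation: abv from the drop in specific
    gravity between the starting wort and the finished beer."""
    return (original_gravity - final_gravity) * 131.25

og, fg = 1.048, 1.010  # hypothetical wort and finished beer
print(round(sg_to_plato(og), 1))       # 12.0 (degrees Plato)
print(round(estimate_abv(og, fg), 1))  # 5.0 (% abv)
```

So a hydrometer reading before and after fermentation is all you need for a decent abv estimate.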

Standard Reference Method
This is a quantification of beer colour. The standard reference method (SRM) is 12.7 times the absorbance of filtered beer at 430 nm. So the SRM is really only measuring "darkness" rather than true colour, and it loses linearity at higher SRMs. There is a closely related system, EBC, which also uses an absorbance measurement at 430 nm; I'm unsure which is in more common usage in Canada. There are other, more sophisticated systems to better quantify the colour of beer. For example, tristimulus colour is determined from measurements at many different wavelengths and describes beer colour in a three-dimensional colour space.
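Since SRM and EBC both scale the same 430 nm absorbance (by 12.7 and 25 respectively), converting between them is simple arithmetic. A quick sketch, with a made-up absorbance reading; the dilution factor is my assumption for dark beers that must be diluted before measurement:

```python
def srm_from_absorbance(a430, dilution=1.0):
    """SRM = 12.7 x dilution x absorbance at 430 nm (1 cm path)."""
    return 12.7 * dilution * a430

def srm_to_ebc(srm):
    """EBC uses 25 x A430, so EBC is about 1.97 x SRM."""
    return srm * 25 / 12.7

a430 = 3.15  # hypothetical reading for a dark stout
srm = srm_from_absorbance(a430)
print(round(srm))              # 40
print(round(srm_to_ebc(srm)))  # about 79
```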
This is an especially cool stat for those who like uber-dark beers. Nothing like bragging about your pint of Guinness with an impressive SRM of 40, until some guy with an Imperial Stout reminds you that his pint has an SRM of 70.

International bitterness units
Hops are added to beer as a flavouring and stability agent. Interestingly, from Kamel:
Incidentally, the hop alpha-acids also have antibacterial and anti-inflammatory properties. That's why traditional India Pale Ales (not that Keith's crap) have a strong bitter, hop taste. Extra hops were added to help preserve the beer on its journey from Britain to India.
The determination of international bitterness units (IBUs) involves extraction of the bitter-tasting hop alpha-acids and quantification by UV absorbance or HPLC. Since this requires sophisticated laboratory equipment, there are alternative methods for the small-scale or craft brewery. One involves adding hop alpha-acids to a beer of known IBUs, like Bud, until it has the same bitterness as the beer in question. While this method has the benefit of drinking lots of beer for replicate samples, it, and the IBU measurement itself, suffer from a disconnect between IBUs and perceived bitterness. A malty beer with the same IBU as a pale ale will taste less bitter.

While I haven't seen quantification of beer carbonation, in other beverages it is reported in grams per litre and can be determined by infrared absorption at 4.27 um. Such a stat would obviously only refer to an untapped keg or unopened bottle/can. Some beers, usually stouts and British ales, are nitrogenated instead of carbonated. Again, I have not seen a beer report its amount of nitrogenation; however, this would also be a useful and descriptive statistic.

More Stats
Kamel's great post about beer foam, entitled Good Head, has some information on the composition of beer head. The denatured protein LTP1 is the main structural component of the head, along with hop alpha-acids. This reeks of a need for quantification. This information, along with carbonation/nitrogenation content, would inform the drinker on proper pouring to ensure the best possible head.

Inspiration for this post came from discovering a very useful quantification of beer greatness: my current favourite beer, Tree Breweries Hop Head, prominently displays the fact that it measures 45 IBUs. I'm also enjoying Phillips Hop Circle IPA, however I don't know its IBU. This is unfortunate, as I think beer stats are a great opportunity for brewers, especially craft brewers, to advertise and quantify their uniqueness. Beer geeks want to know! Yes, this might lead to more of the ridiculousness exemplified by Maximum Ice, and I would also say that there is a subjective quality to beer that perhaps would get lost in the numbers. But sports statistics are popular, and sports geeks know that even though Ovechkin looks better on paper, Crosby's the real winner.



Scienceblogs Pepsigate led to an exodus of bloggers. Some of them maintain independent blogs (Carl Zimmer has maintained a list of their destinations). Others formed a new blogging collective: Scientopia.

If you're looking for other places for 'one-stop science blog shopping', another new network hosts a stable of over 20 science bloggers.

There's also the Field of Science network (which Bayblab was once invited to participate in) and its group of around 20 blogs.

And of course Discover blogs and ScienceBlogs are still big players. (I've been particularly enjoying Rhett Allain's Dot Physics)

That's 5(!) solid science-blogging collectives.

And if you want some individual reads between Bayblab posts, check out:

* If Physical Books Are Dead in Five Years, How Do the Poor Find Books? Whither (or Wither?) the Library? by Mike the Mad Biologist

* The Death of Universities at Sandwalk

(hmm... books and universities both dead in 5 years?)

* Peeing in Space at Neurotic Physiology

* Are grad students professional scientists? at Genomicron (this is an ongoing discussion spanning several posts)

* Basic science: An "obstacle" to students who want to study medicine? at Respectful Insolence


Wednesday, August 11, 2010

But Could He Make a Disease so Great Even He Couldn't Cure it?

Here's the abstract from a paper published in Virology Journal:
The Bible describes the case of a woman with high fever cured by our Lord Jesus Christ. Based on the information provided by the gospels of Mark, Matthew and Luke, the diagnosis and the possible etiology of the febrile illness is discussed. Infectious diseases continue to be a threat to humanity, and influenza has been with us since the dawn of human history. If the postulation is indeed correct, the woman with fever in the Bible is among one of the very early description of human influenza disease.

Infectious diseases continue to be a threat to humanity, and influenza has been with us since the dawn of human history. We analysed a case of high fever that happened 2000 years ago in Biblical time and discussed possible etiologies.
At least they formally ruled out demon possession. The rest of the paper isn't much longer and is a pretty bizarre piece of peer-reviewed research. There's so much to write about it that I don't even know where to begin. Luckily Tara Smith at Aetiology already did most of the work.

Who says the quality of peer-reviewed publication is on the decline?

UPDATE: The paper has been retracted.


Tuesday, August 10, 2010

Blogger is evil

The Pepsigate scandal over at ScienceBlogs is starting to look pretty harmless in comparison to the recent news that Google, which owns Blogger, on which the Bayblab is hosted, is an enemy of net neutrality. I encourage everyone to familiarize themselves with the concept of net neutrality if you are not already; basically, the internet as we know it today is founded on it. A proposal from Google and Verizon would enable the creation of a tiered internet, meaning that large established sites could pay ISPs for better connections. An article in Wired has one of the best summaries of the situation that I have read. Since this Google/Verizon proposal is in conflict with Google's previous stance on net neutrality and, depending on your view, breaks their famous motto, "don't be evil", I am even less inclined to trust the internet giant. Currently this is just a proposal; it is possible the FCC will be able to assert new powers to thwart this plan, and Obama has stated he is committed to net neutrality. However, the fact that these two companies with somewhat competing interests came together on an agreement will make that more difficult, and it has revealed their true intentions.
Here in Canada the situation is a bit better, however, I'm sure that as usual changes in US policy will directly affect us here.
This, on top of the privacy issues with Google, means that I'm personally no longer considering an Android phone and am going to start moving away from a recently created Gmail account in a fairly lame attempt at a protest. This protest will be so lame that the Bayblab won't be going anywhere.


PhD Illustrated

Trust me, this will make sense when you visit this blog.


Friday, August 06, 2010

Cancer Carnival #36

Once again, it's the first Friday of the month, which means it's time for the Cancer Research Blog Carnival. The Carnival relies on posts and hosts, so be sure to submit your posts for next month, and if you're tired of seeing it here on the Bayblab, drop us a line to sign up as a host. On to the posts...

First up, we have a post from Byte Size Bio that arrived just after the last carnival went live. In it, he looks at a paper investigating a role of pseudogene mRNA in regulating tumour biology. By interacting with miRNA, some of these pseudogene mRNAs may act as tumour suppressors or oncogenes!
PTEN1 is a pseudogene which shares a very recent common ancestor with PTEN. A mutation in PTEN1 prevents it from being translated into a protein product, but it can still be transcribed to PTEN1 mRNA. Laura Poliseno and her colleagues have shown that PTEN1 mRNA, being very similar in sequence to PTEN mRNA attracts miRNA molecules that target PTEN mRNA. In other words, PTEN1 mRNA lures PTEN-specific miRNA molecules away from PTEN mRNA, lowering the number of inactivated PTEN mRNAs.
This is a pretty cool finding, and the post was an Editor's Selection, so be sure to check it out.

Here at the Bayblab, Rob points to recent research about fructose metabolism in pancreatic cancer. Orac, at Respectful Insolence, also writes about this study.
So how was fructose metabolized in pancreatic cancer cells? For the most part, it was used for nucleic acid synthesis. Compared to glucose, fructose is preferentially metabolized via the nonoxidative pentose phosphate pathway to synthesize nucleic acids and increase uric acid production. What this means is that fructose provides the raw materials for cancer cells to make more DNA, which cells must do in order to divide and proliferate.
At the Spittoon, 23andMe's blog, their SNPwatch feature highlights mutations associated with liver cancer in Hep-B infected patients.
Rs12136376 is near several genes: KIF1B, UBE4B and PGD. Multiple lines of evidence suggest that one or more of these genes are plausible candidates for HCC susceptibility. Changes in the region of the genome where they are found are commonly seen in many different cancers, including HCC.
Details of the SNP, the analysis and outcomes are all explained in the post.

Keith Robison at Omics! Omics! discusses a project undertaken by Genentech and Affymetrix to scan over 400 tumour genomes for mutations. Keith has a really good explanation of the methodology while MassGenomics has a broader overview of the study findings.

Finally, in another piece of research blogging, Michelle at C6-H12-O6 looks at a paper investigating the effects of caloric restriction on glioblastoma multiforme, a malignant and invasive brain cancer.
As I said, CR-induced ketosis has been known to reduce non-invasive brain tumors. It appears that cancer cells are highly dependent on glycolysis for energy and for some reason (unknown to me, although I'm sure there's literature out there on it) seem incapable of mitochondrial respiration. As such, they cannot use ketones for energy like healthy cells can. Up until now, this hasn't been tested in more invasive cancers, where the tradeoff in neurological impairment might be worth it to stop or delay the spread of the cancer.
This is a nice write up with a good description of the background metabolism. And I don't just say that because of my own interest in CR.

That's it for this month's Cancer Research Blog Carnival. For older editions, visit the Carnival Homepage. Don't forget, the CRBC has subscription options; you can follow by email or RSS feed. An aggregated feed of credible, rotating health and medicine blog carnivals is also available. For a broader collection of science-related blog carnivals, sign up for the Science, Medicine, Environment and Nature Blog Carnival Twitter Feed.


Wednesday, August 04, 2010

Awesome Jesus Gecko Video


Tuesday, August 03, 2010

Fructose induces nucleic acid synthesis in pancreatic cancer cells

A quick read of an article summarized in the mainstream media suggests that at least some cancer cells metabolize glucose and fructose differently. This defies the conventional wisdom that, metabolically speaking, these sugars are interchangeable. While fructose-fed cancer cells do not show increased proliferation, the study suggests fructose is used more readily for nucleic acid synthesis. Glucose in the same cells is used primarily for energy, resulting in lactate and CO2 production. Additionally, another recent article demonstrates that breast cancer cells exhibit a more aggressive phenotype when using fructose as a carbon source. As the Reuters article suggests, this may be of public health significance, as consumption of free fructose increased 10-fold between 1970 and 1990.