Tuesday, 21 January 2014

Twenty skills that I (or any Ph.D.) have that are in demand

A while ago Christopher Buddle posted a blog on SciLogs about what you needed to know before becoming a professor. Many of those skills are also in demand outside of academia.

It got me thinking about what skills I have amassed over a Ph.D., post-doc and faculty position. Any other "recovering scientists" reading this should feel free to steal this list, add to it or perfect it. Any comments or critique would be welcome.
  1. Project management- over my academic career I managed to publish several papers in top journals. Some required precise planning of tasks and experiments on short deadlines, against competition. This requires ensuring that each set of experiments finishes with a high-quality deliverable.
  2. Human resources- as a professor I had to hire, fire and develop staff. This included students and early-career professionals, where you are balancing what they are capable of today with their career goals. I picked projects for them that matched their skills.
  3. Project planning- a PhD is a set of projects that need to be planned out with a full timeline, deliverables and costs. In addition, a key part of a successful PhD or post-doc is knowing when to kill a project.
  4. Stakeholder relationships- each stage of a PhD requires you to set out goals with your faculty advisory committee. These people provide guidance and advice on where you should spend your time. Part of success is ensuring that you cogently show progress toward each member's idea of your success. The stakes get higher as you move to a post-doc, where you are expected to manage both the project and the expectations of your boss.
  5. Budget building- as a professor I needed to build RFPs, prioritize purchases based on project needs as well as the long-term strategy of the lab, source infrastructure, manage vendors and raise funds.
  6. Publications- part of a scientist's job is to communicate results to the community. This includes typical writing skills but also graphic design: matching the visualizations to the message and audience.
  7. Data management- all aspects of data management, including recording high-quality metadata, designing databases, building queries, and integrating public and in-house data into a complete set (see the sketch after this list).
  8. Analytics- a key part of my PhD was defining how to quantitate behavior and images. This requires a clear analytic method that allows reproducibility through a clear, logical scoring rubric.
  9. Web-based research- not just the query, but also deciding which sources are good and which are bad.
  10. Public speaking- I have given hundreds of lectures to groups of all sizes, both lay and expert. This gives me a large set of tools to fall back on for presentation design.
  11. Individual drive- to do a PhD you need an internal drive to do what must be done.
  12. Intellectual flexibility- as part of my PhD I learned at least 12 different technical skills at a high enough level to use them in peer-reviewed publications and teach them to others. I learned these through reading and just doing; I didn't need to be walked through them multiple times.
  13. Records management- my laboratory worked in a high-demand, high-competition environment. We needed to have all experiments documented in a way that would stand up to legal review and could be used as part of a patent process.
  14. Understanding of several healthcare-related regulations- part of my work was related to drug discovery and some of it was in collaboration with clinicians, meaning that we ensured all documents and protocols met the required standards.
  15. Graphic design- genetics is a hard area to explain without pictures. I designed many successful visualizations using Photoshop, PowerPoint and old matte photography techniques.
  16. Process design- my laboratory was at the bleeding edge of genetics. This meant that we were constantly building new processes and testing which resources would be best for each process.
  17. Process optimization- because of our unique methods, we constantly needed to set production standards and build analytics that allowed us to evaluate and optimize processes and make changes that reduced cost and increased reproducibility and accuracy.
  18. Contract negotiation- as part of my job, I have negotiated service contracts and terms of employment.
  19. Fund raising- academic labs are always looking for new sources of funding and interacting with potential investors/funders.
  20. Strategic product planning- a key part of success is understanding where government priorities are now and where they will be over the next five years, in order to develop a funding strategy. Successful scientists also understand the competitive landscape and position their people and infrastructure to keep up.
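To make item 7 concrete, here is a minimal sketch of what integrating public and in-house data can look like in practice. It assumes a hypothetical lab results file and a public annotation file keyed on a shared gene identifier; all file and column names are made up for illustration.

```python
# Minimal sketch of data integration: join in-house results with a public
# annotation set on a shared gene identifier. File/column names are hypothetical.
import pandas as pd

# In-house measurements recorded by the lab (hypothetical file)
lab = pd.read_csv("lab_expression_counts.csv")      # columns: gene_id, cell_type, count

# Public annotation downloaded from a reference database (hypothetical file)
public = pd.read_csv("public_gene_annotation.csv")  # columns: gene_id, symbol, pathway

# Integrate the two sources into one analysis-ready table
combined = lab.merge(public, on="gene_id", how="left", validate="many_to_one")

# Basic quality check: flag lab records with no public annotation
missing = combined["symbol"].isna().sum()
print(f"{missing} lab records lack a public annotation")

combined.to_csv("combined_dataset.csv", index=False)
```

The point is less the specific tool than the habit: every merge gets a quality check, and the combined table carries the provenance of both sources.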

Friday, 18 October 2013

Let's focus on the actual science, not media fluff

Enough already with all of the articles and blogs about how "epigenetics" is the cause of aggression or socio-economic disparity.

Epigenetic modifications to the genome are not more important than genes, and they are not separate from genes. They are how we regulate genes. Genes are not binary; they are not simply on or off. They are used at certain levels for certain tasks ("grow an arm" will use the same genes as "grow a heart", but in much different dosages).

Epigenetics is akin to a thermostat. You wouldn't blame the thermostat for causing winter; you use the thermostat to respond to winter.

Epigenetics is the same! It is the control mechanism that the body uses to respond to the environment, the environment in this case being EVERYTHING outside the nucleus of a single cell. Yes, everything: cell signals, hormones, hunger, emotions, temperature, toxins.

Understanding epigenetics is like particle physics: we can be statistically certain, but we can NOT be definitive about the role of any single epigenetic modification in a disease state or trait inheritance.

It is mind-boggling how complex the potential role of epigenetics is in any disease. We do not even understand how it works at the single-cell level, and yet we have people suggesting that "epigenetics" explains complex traits just because nothing else has explained that trait?!..... It's frustrating. Epigenetics is not magic; it is at least 20 different types of gene regulation. That is all it is... it's boring, fundamental science.

I get it; it's hard to explain epigenetics, but we are reaching Fox News levels of truthiness with some of the blogs and "news" about epigenetics. We, as the educated science community, need to hold ourselves to a higher standard. Epigenetics is part of everyday life: differences in twins, calico cats, zebra stripes. Let's appropriately educate the public using everyday examples and then go deeper. I have found people are more excited by the basics and an honest approach than by being oversold. We get that enough nowadays with the 24-hour news cycle and 365-day political campaigning.

Let's not be part of the problem by "misspeaking" about the wonderful nature of epigenetics. We should be exciting the public about the potential rather than selling snake oil.

Saturday, 3 November 2012

Funding research in the new (poorer) world


The world has changed. Money is tight, and the large foundations and governmental granting agencies are risk averse... which means senior scientists will get the money.

This means that innovation is going to die. Once you get to be a senior scientist you don't have time to fail; you have to feed the beast. I think the best analogy is Wall St. Yes, "too big to fail, I need a bail-out" Wall St. is exactly what most large labs in the world are!

Rather than launch into some quixotic diatribe about how bad this is for health and for science as an endeavor, I'll just talk about how to make it irrelevant.

The answer is small foundations. They have the focus, passion and community to start a real long-term relationship with young scientists. The brand-new, too-ignorant-to-know-better scientist who got their own lab by being the most innovative and best-prepared post-doc is exactly the one who will take the chance, if there is money involved.

All foundations seem to be focusing on drug discovery and biomarkers. This is a great space for smaller, disease-focused foundations to occupy. The problem is that most of these organizations do not have the gobs of money to follow it through in a comprehensive manner.


Getting effective novel therapeutics requires the engagement of talented, creative scientists.


For disease-focused research foundations this can be difficult. In the current economic environment, foundations must have some method to "hook" scientists, whether it be large per-year grants, limited restrictions on spending, or speed of review.


For rare diseases this means getting in during a scientist's early phase, when they are shaping and limiting the vision of their lab. Rare disease research requires passion and a way to find general funding that will maintain the lab.


In this day and age, when peer-based mentoring is sadly lacking, a foundation that can provide some guidance on where and how its researchers can leverage funding and expertise can gain loyalty and expand by word of mouth. This will then lead to larger "sexy" studies and fund-raising.


This kind of thinking can work hand in hand with maximizing fund-raising, as you can tell fund-raisers that a large portion of their money goes directly to attempting to cure or ease their disease rather than to the greater good.


Overall: small foundations should look to maximize the effect of their funds to effect change in their specific disease.

Through targeting two areas of research:
1. Cellular characterization of the disease (i.e. what properties differ between normal and diseased cells)
2. Drug/therapeutic design and testing, no matter how speculative. This assumes that the idea or test system can pass peer review.


The two areas would have separate competitions and two separate funding paradigms:

1. Short funding cycle- a micro-finance model. Short grants with quick turnaround: a 2-year grant with a hard progress report against mutually agreed-upon, measurable progress.

2. A prestige grant- larger, "no questions asked" 5-year funding with no reporting for 2 years. Again, mutually agreed-upon, defined goals that MUST be met to receive the final 3 years of money.


The grants would be open to academia and small biotech. There would also be a bonus for academically led Pharma-academic RFPs, with a significant and clear partnership, NOT just "in-kind" contributions.


The research and fund-raising would have a high degree of back and forth. The foundation would hold a stakeholders' conference where selected funded scientists would come and explain the state of the research in layman's terms.


There has to be greater outreach from the scientists at foundations. The office of the CSO should engage in various forms of social media to engage the public and find funds (with the guidance of the executive board). This can no longer be left to the summer intern who just finished Bio 101. The public is too smart for that, and frankly, if I were looking to a small foundation for funding I would want to know that its scientists are engaged. It should be an expectation, not just a hope, that scientific merit is judged by scientists.

Sunday, 16 September 2012

Time for some convergent evolution in knowledge management

As I move from the ivory tower of neuroscience to the practical, business-related advice that Info-Tech gives clients on their IT environment, I'm amazed at how many parallels I see in the needs and the solutions across all kinds of human endeavours.

For example, I just finished talking to a vendor about how enterprises can manage and maximize their content (images, documents, blogs, etc). Much like my own thinking on this, @OpenText is convinced the core issue is about information movement, not what it is stored in (i.e. a Word doc vs. an Excel sheet).

For me this comes back to a practical problem that I had as a graduate student. My Ph.D. was on how gene expression relates to brain development. The brain develops in a fascinating manner; it starts out as a tube that grows outwards at specific points to build the multi-lobed, broccoli-esque structure that allows all vertebrates, but particularly mammals, to have diverse behaviours and life-long learning. These complex behaviours rely on an immensely diverse set of brain cell types. Not only is there great diversity of cells, but each cell needs to get to the right place at the right time.

Think of a commute on the subway; not only do you need the right line but if you don't get to the station at the right time you won't get to work on time. This could lead to you getting fired. For brain cells this could lead to death. For the organism it could mean sub-normal brain function-and potentially death. The fact that the process works is a testament to the astounding flexibility and exception management built into cells by their epigenetic programming.

There is, however, one big problem with this type of brain development: the skull. The skull limits the number of cells that can be created at any given time. Practically this means that the level of control that must be exerted over the number of any one cell type is very tight. The control comes from coordinating which genes are expressed in each cell type to allow cells to make decisions on the fly. Usually it starts with brain cells taking off in a random direction, which then informs what type of brain cell they will end up being when they arrive. The cells then proliferate as they move, based on the contextual information that they receive about how many more cells are needed. This all happens through cell-to-cell communication and rapidly changing patterns of gene expression.

(Wait for it, I'll get back to the parallel problems, honest....)

As you can imagine this was (and still is) a daunting problem to investigate. My research involved a variety of time-staged images; reams of Excel workbooks on cell counts and brain size; Word docs on behaviour; and whole-genome expression sets. It was a big data problem before the phrase existed (Business parallel No. 1). In reality I had no problem keeping track of all this data, looking at each piece and doing the analysis on each piece. I had very good notes (metadata) and file naming conventions (classification) to ensure that I could easily find the file I needed. I was, in effect, a content management system (Business parallel No. 2). The problem was synthesizing the separate analyses into a cogent piece of information, i.e. something that can be shared with others in a common language that allows others to build their own actionable plan (Business parallel No. 3).
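To show what being your own content management system can look like, here is a minimal sketch that walks a directory of files following a naming convention and builds a searchable metadata index. The convention (project_YYYYMMDD_assay) and the paths are hypothetical stand-ins for whatever notes and classification scheme a lab actually uses.

```python
# Minimal sketch: build a metadata index over files that follow a
# "project_YYYYMMDD_assay.ext" naming convention (hypothetical convention).
from pathlib import Path
import csv

def index_files(data_dir: str, out_csv: str) -> None:
    rows = []
    for path in Path(data_dir).rglob("*.*"):
        parts = path.stem.split("_")
        if len(parts) < 3:
            continue  # skip files that do not follow the convention
        project, date, assay = parts[0], parts[1], "_".join(parts[2:])
        rows.append({
            "project": project,
            "date": date,
            "assay": assay,
            "format": path.suffix.lstrip("."),
            "path": str(path),
        })
    with open(out_csv, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["project", "date", "assay", "format", "path"])
        writer.writeheader()
        writer.writerows(rows)

# Usage (hypothetical directory): index_files("raw_data", "metadata_index.csv")
```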

Any scientist reading my dilemma from 15 years ago can probably relate, and so can anyone else who uses and presents information as part of their job. The reality is that technology can only solve the problem if people recognize the problem and WANT to be systematic in their habits... the willpower to be repetitive in their approach to work is sorely lacking in most knowledge-based workers. Ironically, a lack of structure kills creativity by allowing the mind too much space to move within. The advent of the NIH's online databases of genomic, chemical and ontological data has given scientists a framework to work within to quickly get up to speed in new areas of investigation. Unfortunately this has not trickled down to individual labs (again, more proof that trickle-down anything doesn't work effectively; it's just not part of human nature).

This lack of a shared framework across multiple laboratories is becoming a real problem for both Pharma and academia (and everyone else). The lack of a system has led to reams of lost data, and with it the nuggets of insight that could provide real solutions to clinical problems (Business parallel No. 4). This also leads to duplication of effort and missed opportunities for revenue (grant) generation (Business parallel No. 5).

From a health perspective, if we knew more about what "failed drugs" targeted, what gene patterns they changed and what cell types they had been tested on, we could very quickly build a database. From a rare disease perspective, the cost of medical treatment is partially due to this lack of shared knowledge. How many failed drugs could be of use for rare diseases? We will never know.
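To make the idea concrete, here is a minimal sketch of the relational structure such a shared database might start from: compounds, the targets they hit, and the cell types they were tested on. The schema, names and example gene are hypothetical illustrations, not a reference to any existing resource.

```python
# Minimal sketch of a shared "failed drugs" knowledge base using SQLite.
# Schema and names are hypothetical illustrations, not an existing resource.
import sqlite3

conn = sqlite3.connect("failed_drugs.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS compound (
    id             INTEGER PRIMARY KEY,
    name           TEXT NOT NULL,
    failure_reason TEXT               -- e.g. toxicity, lack of efficacy
);
CREATE TABLE IF NOT EXISTS target (
    id          INTEGER PRIMARY KEY,
    gene_symbol TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS test_result (
    compound_id       INTEGER REFERENCES compound(id),
    target_id         INTEGER REFERENCES target(id),
    cell_type         TEXT,           -- what it was tested on
    expression_change TEXT            -- observed gene pattern change
);
""")

# A query a rare-disease researcher might run: every failed compound that
# touched a gene of interest, and the cell types it was tested in.
rows = conn.execute("""
    SELECT c.name, t.gene_symbol, r.cell_type
    FROM test_result r
    JOIN compound c ON c.id = r.compound_id
    JOIN target   t ON t.id = r.target_id
    WHERE t.gene_symbol = ?
""", ("MECP2",)).fetchall()
conn.close()
```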

This is a situation where scientists can learn from the business community about the technical tools that really allow long-term, shareable frameworks. These technical controls are available at every price point. Conversely, the frameworks and logic that scientists use to classify and link pieces of content have lessons for any knowledge worker.

It's time for some open-mindedness on both sides; the needs of all kinds of organizations and workers are converging: too much data, too many types of data, not enough analysis. Evolution is about taking those "things" that work and modifying them for the new environment.


Thursday, 28 June 2012

What is epigenetics?

It's a question that I am often asked. The answer is complicated since, in my opinion, the viewpoint of the field shifts based on where the next sexy, money-generating science is coming from.


So let's start with the word epigenetics and its semantic meaning. If we break down the word, we can see what the word and the field have come to mean in the last twenty years. The first part, epi, is Greek for above. Genetics... well, realistically it is a galaxy of smaller fields dedicated to studying what genes are, how they are involved in disease and development, and how genes regulate each other.

So epigenetics is the study of processes above genetics... which is where the etymology fails to be useful. Hence the confusion amongst scientists and the public at large.


In a practical sense epigenetics is an in-depth look at how genes regulate each other. Genes are, at their strictest definition, the precursors to the proteins that perform (most of) the jobs that make life possible.

While genes are the stars of the show, in terms of the regulation of biological processes they are the least interesting part of the genome. The interesting part is the so-called junk DNA, or more accurately non-coding DNA. If genes are the stars, then non-coding DNA provides the role players and the scenery that move the show forward.


The dance of how these regulatory proteins and modifications are choreographed is epigenetics.


Non-coding DNA regions control which genes are expressed and when they are expressed. This occurs through RNA molecules, the binding of proteins, and enzymatic modification of those proteins. It's more complex than what I am highlighting, but the core message here is that epigenetics is about controlling access to the genome by RNA and proteins through the non-coding regions of genomic DNA.


Epigenetic processes are the key to providing organisms with the flexibility to respond to the environment. So what about these processes makes them so important? As with any regulatory process, it depends.


The best analogy is the "genome as the book of life". If genes are the words by which an organism is "made", then epigenetics is everything else in the book. At the lowest level it is the sentence structure that allows the words to have meaning. At the highest level it is the chapter order that gives the story a linear order. Unlike a real book, each tissue in the body can shuffle these elements on the fly... a choose-your-own-adventure book based on cell type. Further flexibility arises from how each individual cell interprets the story that it is given.

While the book analogy is interesting and useful, the beauty of the system in my opinion is that all of this is managed through biochemistry: small differences in enzyme kinetics and subtle changes in protein binding. This biochemistry leads to a wide variety of markers that can be used to denote parts of the genome.


The cell uses different types of markers for each of the particular categories: grammar, pages, chapters. Each of the markers that the cell can use for these categories has a varying ease of use, which is a function of the biochemical processes by which the cell adds or removes the markers. If this all seems like overkill just to express a gene... it sort of is. Adding layers with differing ease of use allows the cell to impose very tight regulatory control. It lets the cell mix and match different markers to make bookmarks, highlight favorite passages, mark an often-used paragraph, etc.

In general the chapter markers are direct methylation of the DNA. As the name suggests, this is the addition (or subtraction) of methyl groups to DNA. Methylation alters the affinity of DNA for a subset of proteins that occlude the DNA, basically hiding the genes. Removing the methyl groups abolishes this occlusion.

Sentence structure is more complicated, but in general there are two types of post-translational modifications that are most commonly seen as having a definitive role in this process. They usually involve the structural proteins that surround DNA. These proteins are the histones, and they allow 2 linear metres of DNA to be packed into a space roughly 0.00001 metres across. An amazing feat in and of itself!

Histones can be modified in a wide variety of ways, too numerous to mention here. The combinatorial placement of these proteins and modifications on DNA allows for a very precise grammar to be used; so precise that cells can communicate to their neighbours exactly which protein should be expressed. Given that there may be as many as 31 thousand to choose from, that is quite impressive. What are histones? That is another story for another post. The bottom line here is that these proteins control access to the DNA. The modifications on these proteins act either to loosen the structure and increase access, or to glue the proteins together, blocking access.
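None of this maps neatly onto code, but a deliberately crude toy model can illustrate the layered logic: a durable "chapter" mark (DNA methylation) overrides the faster "grammar" marks (histone modifications), so the same gene can be readable in one cell type and hidden in another. This is an illustration of the bookkeeping, not of real regulatory rules.

```python
# Toy illustration of layered epigenetic control; not a biological model.
from dataclasses import dataclass, field

@dataclass
class GeneState:
    dna_methylated: bool = False                        # slow, durable "chapter" mark
    histone_marks: dict = field(default_factory=dict)   # fast "grammar" marks, e.g. {"open": 2, "closed": 1}

    def accessible(self) -> bool:
        # Methylation hides the gene regardless of histone state.
        if self.dna_methylated:
            return False
        # Otherwise the faster histone marks decide, by simple majority here.
        opening = self.histone_marks.get("open", 0)
        closing = self.histone_marks.get("closed", 0)
        return opening > closing

# The same gene can be readable in one cell type and hidden in another,
# purely through which marks each cell has laid down.
neuron = GeneState(dna_methylated=False, histone_marks={"open": 3, "closed": 1})
muscle = GeneState(dna_methylated=True,  histone_marks={"open": 3, "closed": 1})
print(neuron.accessible(), muscle.accessible())  # True False
```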

So what, you say? Well, the answer is variation: variety in cells (brain vs. muscle), variation between twins, variation in cell response, variety in drug response, variation in disease. Epigenetics is the root cause of variation at every level: species, organism, tissue, cell.

So until we understand how this biochemical signature is modified, we can't really understand how cancers vary from person to person or, in the bigger picture, how we retain biodiversity.

Saturday, 9 June 2012

Health IT and clinical research

I recently returned from the E-Health Canada conference in Vancouver. I was there as an analyst for Info-Tech research group. I spoke about secure use of consumer devices in healthcare and the potential of cloud computing as a flexible model to deal with CoIT.


I was pleasantly surprised by the IT knowledge level of the nurses and doctors who attended the conference. Why was I surprised, in this age of consumer devices and self-service tech? Well, I spend a fair amount of my time talking to and about IT departments. The impression that I get from many of them is that while most people can set up their email on their phone, they remain largely clueless about the actual tech itself.


Well, there is certainly a core of nurses and doctors who understand the tech and have really great ideas for how to make it work better in the context of healthcare delivery. I was left with the feeling that many remain frustrated with the current solutions and the pace at which e-health is moving forward. The major areas of frustration were around content delivery and system upgrades for usability. I would summarize it as “You must do something; protect the data but be flexible on the device used”. Technology should allow doctors to spend more time looking at and talking to patients, not with their noses buried in a tablet. Just placing a tablet in doctors’ hands doesn’t mean it will lead to increased use of IT solutions for healthcare.


While some may point to this as a technology problem, it was clear from talking to the vendors that the solutions available today can meet the demands that hospitals are placing on IT. As someone with a past in research who has dabbled in clinical research, it was refreshing to know that there is a variety of solutions out there to make clinical research easier. What was interesting was how similar health IT problems are to those of many other industries. In my opinion the issue is now about getting the best out of the tech, not when the tech will exist.


In other words, it's about getting the users and the admins on the same page.


As someone with a deep passion for rare diseases and the use of high-level biomedical research, it's somewhat frustrating that the systems in use today are so antiquated. The upgrades available today could add so much intelligence to how we treat rare diseases in particular.


The new areas of stem cell therapy and epigenetics hold HUGE promise if we can understand the relationship between disease and biology in these patients. Since these patients are so rare at any given location, it is imperative that we have a way of sharing data that is safe enough to move patient histories across borders.

Wednesday, 7 March 2012

Open access and generating senior scientist buy-in


There is a vibrant, intelligent but completely impractical debate happening right now around publishing scientific papers; just search #openaccess to see the volume of Twitter posts. The concepts are great: faster dissemination of information, a cleaner peer review process, greater collaboration.

My issue is the lack of a reality check or a way to bring it into real-world use. Science at its heart is a glacier: cold, unworried, progressing forward in an unstoppable manner. It also has the inertia of literally hundreds of years and, in general, a highly conservative set of guidelines. What needs to be fought against is the idea that peer review requires a third party outside of the scientific community. The internet and the transparency that it brings make a paid third-party watchdog unnecessary; there are plenty of folks on the internet just looking to yell "gotcha".

There is some value to the conservative mindset that works well for society and progress: the burden of proof. The conservative guidelines protect science from making too many mistakes and jumping to conclusions based on the unexplained. It's what prevents Einstein's theory from being torn down by a faulty wire. It also means that the younger generation of scientists must bring ideas to the hallowed halls of old science and prove that they will work, by bringing real-world suggestions that will fix the interwoven problems of publication, attribution, grants, jobs, and tenure.

Otherwise it's just pissing in the wind and complaining. Not my idea of what scientists do. They come up with theories, models and explanations that are then roundly torn down by their peers and rebuilt into a better product.

With that said, let's take a look at how open access and "non-publishing" publishing could work in the real world*.

First, some challenges that I see as the main drag on the process:
  1. Comparative analytics. There need to be transparent metrics to gauge the value of the research to the larger community. This can be an issue, since every scientist believes their research is the most important in the world.
  2. Control over content- this may seem trivial, but if I am the principal investigator I'm not sure I want every post-doc and grad student uploading their crappy, blurry images. The metadata surrounding who gets tagged, what gets tagged and the terminology used is vital to ensuring that the data can be reviewed by everyone who may be interested.
  3. Clear lines of permission- what role do collaborators play in making this public? Who gets to post it? Where is it hosted? The university still has some co-ownership. This is not something that can be decided afterwards; it has implications for grants, etc.
  4. What about intellectual property? I'm all for open collaboration and sharing data with academics, but what guarantees are there that pharma and biotech will play by these rules? The beauty of public research from the grantor's (read: government's) point of view is that it maximises their investment. That is lost if [insert huge pharma company here] comes along, builds a drug based on that data and charges the public obscene amounts of money.
  5. The content that is made public has to have a finished look. This can't be just thought vomit. It must have some clarity of thought and citations. Put some thought into it! Distill it by putting it into context: why should I care? How does it support or counter the current models? Provide proper citation through hyperlinks. It should be peer review, not crowd-sourced editing.
  6. Control over comments. Comments can't be removed just because someone says your data quality sucks. This has to be transparent, warts and all. You have to take reviewers' suggestions for more experiments seriously. As someone who has reviewed for a variety of journals (top five to lower tier), nothing is more frustrating than taking the time to review and having someone completely disregard it.
  7. Buy-in from senior scientists. Science is largely an oral history; the reality is that for most methods, just having the paper is useless for understanding how to do it or where the problems can arise. The insights of senior scientists who have seen it all are required for this to be truly revolutionary.
  8. Buy-in from administrators. I for one do not believe that open publishing will be any less time-consuming or any cheaper for the scientist. Someone will have to maintain the database for the content and ensure that there is enough capacity for the videos, Excel files and whatnot that will be made available. This will have to be maintained for decades, with back-ups, etc. Someone will have to know where the content is and what it is, AND when it should be taken down and replaced. Right now the university covers those costs for each department's website, and it won't take on more unless there is a clear benefit, i.e. grant money, donors, clarity on which faculty members are doing well and, in a perfect world, which require more help.

Now the potential solution:

A dedicated website where each lab publishes its own data. Personally I don't believe that all data should be made available, but some labs and branches of science believe that is best. Allow the system to have some flexibility; some fields are inherently more competitive and technically nuanced than others. Scientists need to retain the ability to check data quality, accuracy and potentially fundamental flaws in experimental setup as a lab group prior to making it public.

I like figshare and Creative Commons and all of the really great tools that are coming at breakneck speed. I love the idea of posting all of the data, but I truly believe that my job as a scientist is to analyse the data, not just find/acquire it to send to others. If open access publication does not include this, it will set back cancer research and other highly nuanced fields in the biomedical sciences by years. These fields have moved at breakneck speed because such a premium is placed on succinct analysis by the publishers. While I do not believe publishers should be the gatekeepers, I do not want to lose the analysis of data for the sake of speed of publication.

The solution: most labs have university-owned/managed websites where the principal investigator (aka PI, professor, i.e. the person whose tuckus is on the line) owns the admin rights. These should become more of a real-world home for the publication and sharing of lab data. The infrastructure exists, and some labs do a great job of updating it with content as it gets published. Everyone else needs to get on board and bring it into the modern age, with appropriate tagging and labels to ensure that it can be found through search engines.
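As one sketch of what appropriate tagging and labels could mean in practice, a lab page might describe a posted data set using schema.org's Dataset vocabulary so that search engines can index it. The fields, names and URLs below are purely illustrative, not a description of any particular lab's site.

```python
# Minimal sketch: generate schema.org Dataset metadata (JSON-LD) for a lab
# web page so the posted data can be found through search engines.
# All names, URLs and values below are hypothetical.
import json

dataset_metadata = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Cortical progenitor cell counts, time-staged images",
    "description": "Cell counts and staged images supporting the lab's posted analysis.",
    "keywords": ["neurodevelopment", "cell counts", "gene expression"],
    "creator": {"@type": "Person", "name": "Principal Investigator"},
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "distribution": {
        "@type": "DataDownload",
        "encodingFormat": "text/csv",
        "contentUrl": "https://lab.example.edu/data/cell_counts.csv",
    },
}

# Embed this on the page as <script type="application/ld+json">...</script>
print(json.dumps(dataset_metadata, indent=2))
```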

PIs need to retain control; they are the ones who will be held accountable if the "published" result that shows a new cure for cancer turns out to be a contaminant. The pervasive nature of the Internet means that the media has access to the data and needs to hype itself as having the most interesting story. "Never let the truth get in the way of a good story" is a truism for more and more journalists. Accidents happen, and while I'm alright with being embarrassed by my peers figuring it out, I'm not alright with it spinning into a worldwide story and having to explain to the public that it was an "oops".

Main point: use the established website, where each PI has admin rights to remove content. Give senior lab members the ability to publish concise analyses with appropriate figures and links to the whole data set, with metadata and clear descriptions. Part of the mentorship for new post-docs and graduate students would be training on what is considered an acceptable level of proof prior to making the data public. Laboratories are still training grounds; as someone who has trained students and post-docs, the peer review process gives young scientists looking to move up to the majors an idea of what good science is, and this cannot be lost by opening up publication. I love the citizen science movement, but at the end of the day, like anything, there has always been a difference between someone who does something as a hobby and someone who has the discipline, passion and willpower to dedicate their life to a subject. Training and the culture of science have to be part of it; science is about justifying your opinion and the quality of your data.

Peer review isn't broken; the publishing models are. Let's not throw the baby out with the bath water. The university-controlled site allows for clear rules of engagement for pharma and media, and allows for the level of control that PIs and chairs need to ensure that crappy data doesn't spiral out of control into a scandal. The departmental chair and grant study groups can look at the metrics (website views, re-links, etc.) to allow flexibility in the systems for review, whether it be tenure, grants or something else that no one has thought of yet. Open access will be a failure if it does not offer something to everyone involved with the industry (yes, it's an industry! get over it, people). This proposal isn't perfect, but it can be piloted in a way that senior scientists from the core Cell, Nature and Science author pool can at least talk about. I think that many of the ideas being bandied about are much better than this for science as a whole and will ultimately be the long-term solution.

That being said, I have yet to see an idea that any of the top-level cancer, stem cell, etc. scientists will buy into. This is not a group to dismiss; they may not be the majority, but they represent the main reason scientists will not give up on Elsevier or any other for-profit publisher. They are also the presidents or senior leadership of some of the most influential universities in the world (Caltech, Rockefeller U., Memorial Sloan-Kettering, Max-Planck, etc). As a final reason to get them on board, they also sit on the grant study groups and a variety of other activities that affect all levels of, at least, the biomedical sciences.

At the end of the day the risk posed by publishing incorrect data needs to be balanced with greater access and conversation about what can be done next. Please comment as you see appropriate.


*Disclaimer: I only have experience with a limited number of institutes (eight), so I do not know if any of this is widely applicable. I have no idea if the issues that I am bringing up are universal or limited to the institutes that I have worked at.