
Monday 30 March 2015

It is not the EMR that sucks, it is your lack of an information governance strategy

Hospitals are drowning in technological deficits: aging equipment, poor information security, unusable electronic health records and, importantly, no effective way to share patient records with patients or partners. A recent study shows that two out of three hospitals are not meeting HITECH standards for Health Information Exchanges. The authors note that even though there are fines (see here for HITECH fines), it is unlikely that these will spur adoption of health information exchanges (HIEs).

Bernie Monegain (Editor, Healthcare IT News) does a great job summarizing the article. From a software vendor perspective this seems like a perfect storm: a gap in capabilities, upcoming deadlines and a change in revenue models. In any other industry there would be lines of vendors at every hospital CFO's door trumpeting their ability to help meet those deadlines. Unfortunately the usual suspects among Health IT vendors have not seized the opportunity (see here).

As I have mentioned before, this is where ECM, WEM and similar vendors should be stepping in to fill the gap. Hospitals of all sizes will need to be able to confidently exchange patient information and make it available to patients once the Meaningful Use 2 standards for patient accessibility come into effect. Providing a mechanism to share patient information in a standardized, secure manner is not a nice-to-have item, it is required to meet obligations, and it should be on every hospital CIO's, CFO's and CEO's radar.

It also speaks to the larger problem of what electronic health records are strategically versus the narrow software characterization. Healthcare providers and thought leaders need to acknowledge that the software sucks and is not the best place to share and view information. It is just a dumb database designed to HOUSE patient information in a safe manner; as the name suggests, an EHR is part of a records management strategy.

Electronic patient information has the potential to increase the efficiency and cost effectiveness of healthcare delivery. The problem is that the variety of solutions deployed by individual healthcare practices makes integration at the regional and national level difficult. As a rule they have been bought as point solutions to an immediate problem rather than as part of a healthcare information governance strategy.

It is time to look past a single solution that has a single set of technical specifications and build a system that manages data access.

As with any application rationalization process, it is important to define the costs, benefits and integration needs for any new enterprise application. Make no mistake: Health IT can no longer be a single-application portfolio; it has to move to an ecosystem approach based on both clinical and administrative needs.

The failure of the single point solution of EHR/EMR has caused many IT professionals to take a negative view of information technology itself. As I have mentioned before, the problem is not the storage of the information, it is how to access the information. It is a content management issue, be it ECM, WCM or -gasp- SharePoint. EHR/EMR systems are horrible at providing access. For Meaningful Use 3 compliance and for your external marketing you need some kind of content serving system.

For organizations in a position to move to the newest EHR/EMR products, there may be no reason to have an additional system. For everyone who doesn't see a rip and replace in their next five years, consider how you will integrate all the devices and partnerships that you have (and will have to grow to stay in compliance with Meaningful Use).

You have a variety of regulatory items to think about as you develop your information governance strategy:

HIPAA 5010 covers electronic data interchange (EDI [X12]) compliance standards as mandated for 1/1/2012. It covers the exchange of all data transmitted by FTP, HTTPS, etc., and also encompasses the letter and number codes used for identifying file types during transactions. 5010 is largely an attempt to standardize the file codes in a way that increases security through in-flight encryption with decryption at each end. This is only possible if there is a standard metadata set.
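To make that concrete, here is a minimal sketch of the kind of envelope check a compliance gate might run before a file leaves the building. It assumes a raw X12 string with the default "~" segment and "*" element separators; real validation needs a proper EDI library, and the sample interchange below is invented for illustration.

```python
# Hedged sketch: confirm an outbound X12 interchange declares version 5010.
# ISA12 (the Interchange Control Version Number) is "00501" for HIPAA 5010.

def declares_5010(x12_text: str) -> bool:
    """Return True if the ISA envelope declares version 00501 (HIPAA 5010)."""
    for segment in x12_text.strip().split("~"):
        elements = segment.split("*")
        if elements[0] == "ISA" and len(elements) >= 13:
            return elements[12] == "00501"
    return False

sample = ("ISA*00*          *00*          *ZZ*SENDER*ZZ*RECEIVER"
          "*150330*1200*^*00501*000000001*0*P*:~")
print(declares_5010(sample))  # True
```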

ICD-10 is completely different; it is the International Classification of Diseases (Revision 10). It is used mainly for e-billing purposes as part of the diagnostic reference. It is not the official standard in the US until 10/1/2013, and the HIPAA 5010 EDI standard is a prerequisite for use of ICD-10.
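For hospitals staring down the cutover, the mechanical part of the job looks roughly like the sketch below: remap legacy ICD-9 billing codes to ICD-10 and flag what can't be translated. The two sample pairs are real equivalences from the CMS General Equivalence Mappings (GEMs); a production system would load the full GEM files rather than a hand-typed dict.

```python
# Hedged sketch: translate ICD-9 claim codes to ICD-10, flagging gaps.

ICD9_TO_ICD10 = {
    "250.00": "E11.9",  # Type 2 diabetes mellitus without complications
    "401.9": "I10",     # Essential (primary) hypertension
}

def remap_claim_codes(codes):
    """Return ICD-10 codes, marking anything the map cannot resolve."""
    return [ICD9_TO_ICD10.get(code, f"UNMAPPED:{code}") for code in codes]

print(remap_claim_codes(["250.00", "401.9", "V72.31"]))
# ['E11.9', 'I10', 'UNMAPPED:V72.31']
```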

Device access: smartphones and tablet computers represent the next wave of technological innovation in healthcare, not to mention medical devices and consumer health apps (see here for more thoughts).

Mobile is a key aspect of your long-term success. Hospitals have a variety of high-earning "part-time" and ad hoc employees with their own businesses to run. You need a way to integrate their independent processes and access into your secure information systems.

As with any access decision, the type of information that can be accessed has to be balanced against the need for audit and security. The key is to remember the needs of end users (a minimal sketch of what this looks like in practice follows the list):
Doctors need access to all data, so restricting parts of the record is not an option.
Nurses need to update records on the fly.
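Here is a hedged sketch of role-based access with an audit trail, using the two roles above. The role names, permission strings and logging setup are all invented for illustration; the point is that access is granted by role while every request, allowed or denied, lands in the audit log.

```python
# Minimal sketch: role-based access to patient records with audit logging.
import logging

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("phi.audit")

PERMISSIONS = {
    "doctor": {"read_full_record", "update_record"},  # doctors see everything
    "nurse": {"read_care_notes", "update_record"},    # nurses update on the fly
}

def access(user: str, role: str, action: str, patient_id: str) -> bool:
    allowed = action in PERMISSIONS.get(role, set())
    # Log grants AND denials; denials are often the interesting audit events.
    audit.info("user=%s role=%s action=%s patient=%s allowed=%s",
               user, role, action, patient_id, allowed)
    return allowed

access("dr_smith", "doctor", "read_full_record", "MRN-0042")  # True
access("rn_jones", "nurse", "read_full_record", "MRN-0042")   # False
```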

IoT devices: a lot of new devices have appeared for use in healthcare, including patient-owned health apps, mobile phones and wireless medical devices (see here for more on this). One of the key shortcomings of today's EMR/EHR products is their lack of abilities on the user experience front. Hospitals need to move away from single-point-solution planning for applications to an information management strategy that includes integration of outside data, whether it comes from patients, partner clinics or device vendors.

IT managers need to take the initiative and do these three things:

  1. Ensure that the process involves care providers and administration in the same room. These meetings cannot be for show. All decision makers must be involved. 
  2. Get to know who the key decision-making doctors are in each department and develop a relationship. Some doctors are in favor of EHRs; find out who they are in your hospital/clinic and involve them in building a strategy for how to attack the implementation. 
  3. Get care providers on board during the demonstration phase. Take your key decision makers through the products and ask questions about the mundane parts of the software (first impressions of the GUI, how to access the records), not just the big-picture items.

Tuesday 24 March 2015

Supporting clinical research: a conversation with LifeQ

The age of biometric data is upon us, but the science is not ready to explain what the data means. In some ways it is really exciting; the potential is huge, even if we ignore the marketing materials and focus on the potential long-term use of simple data collected under real-world conditions. Stephanie Lee from BuzzFeed had a sobering analysis of Apple's new ResearchKit: the healthcare and clinical research value of the data is pretty much zero. I completely agree (see my thoughts on Apple's foray into healthcare here). The only group that might see some value is the same group that has access to healthcare and quality jobs (see here for primary data from the Pew Research Center). This means that the biometric data pulled from the "iEcosystem" will not reflect the population that acutely needs to be understood biometrically. (I'll provide a detailed example of the issues later in this blog.)
In my opinion, any data that is tied to a specific mobile device or "Internet of Things" (IoT) object is useless for healthcare unless it can be compared and combined in aggregate across devices and demographics.
It reminds me of the mid-nineties, when genomic sequencing was going to revolutionize healthcare and disease treatment. Twenty years later we are finally realizing that the genome is an almost irrelevant piece: the context of how that genome is read, acted upon by the cell, and communicated between cells is more important than any point mutations or small-scale genomic changes. (I have written about this here, in the context of cancer.)
The genomic age was necessary to spark the discoveries that are starting to change healthcare, but the changes in healthcare won't be realized because of genomic biology. It seems to me that we are at the same crossroads with the Internet of Things (IoT). The technology is really cool and the visualizations are solid but......what do I do with it?
For example, I have a Fitbit and it has literally changed my activity, purely from trying to get to 10,000 steps......I think that's a good thing- I mean, I lost weight and my back is better.....but I am left wanting more. What types of activity are related to my weight loss? Have I gone far enough? Am I at lower risk for all of the things that I worry about from a health perspective?
I can tell you the data collected by my Fitbit is pretty useless for answering these questions. I downloaded it all and ran it through a few different statistical models, and guess what? None of it appears to be relevant to my ongoing good health. I still use my Fitbit to track my activity, but I have no illusions about the role that the collected data plays in my healthcare decisions.
I recently had a chance to talk to a really interesting start-up company called LifeQ. LifeQ (@LifeQinc) has restored some of my enthusiasm for IoT and real, impactful changes in healthcare. LifeQ has taken a different approach to the Internet of Things. LifeQ owns intellectual property for an optical sensor that uses light waves to penetrate the surface of the skin to monitor multiple biological measurables: heart rate, blood pressure and oxygen saturation, with other important measurables such as glucose in the beta testing phase. The real power of LifeQ is not the measurables. Most of the metrics that their sensor measures are relatively commonplace; many devices can measure heart rate, blood pressure and glucose, these are not unique. The true value of LifeQ as an IoT vendor is really in the predictive models and software that allow identification of changes in one's own biology. As Christopher Rimmer pointed out, this is very similar to the model that Google, Microsoft and Apple have pioneered. LifeQ owns the core data acquisition (the "OS") and the core platform for integrating and using the information (the "search engine"). If LifeQ can be half as disruptive in healthcare as Google has been in mobile, they can be a driving force for systemic cost reductions and better treatment outcomes.

The device-agnostic approach gives LifeQ a wide potential market in healthcare and fitness, as well as the flexibility to weather the inevitable changes in which devices end users are willing to use. The focus on data acquisition and analysis reduces the overhead and ensures that the width and breadth of data needed for accurate modeling can be gathered.
As LifeQ told me during our conversation "You can't build a great, high quality algorithm and data access AND build multi-functional devices at the level required to collect the data we need. There are plenty of companies in the health, medical and consumer device world with the pockets and desire to build high quality readers."
It is a really smart strategy, especially in the complex global healthcare and lifestyle market(s): focus on your strength and be choosy about partnerships. This strategy allows LifeQ to ensure data quality and, more importantly from a medical perspective, information security. High-quality data that is combinable across devices is necessary to keep the predictive models relevant and increasing in accuracy with successive iterations.
Obviously the key risks are in how to ensure that the partners continue to innovate on the physical devices, and in how to integrate data collected from different devices into a single model. To keep with the Google analogy: how do you build the back end to protect against device fragmentation when each device manufacturer has specific needs and market segments? At least the kinds of companies that they are dealing with understand the necessity of spending on the hardware.
Not surprisingly, the initial partnerships are consumer focused, within personal performance niches such as those that cater to extreme athletes, with some wider consumer-focused partnerships as well. An interesting aspect will be how LifeQ can integrate the niche data into the predictive model without biasing it against normal people's fluctuations. For example, we know that part of what makes elite athletes, well, elite, is speed of recovery: their heartbeat decreases at rest faster, their rate of breathing decreases faster, their muscles recover faster. So as LifeQ collects this data, what value does it have for us "normals"? Will the models be accurate?
It is not an insurmountable challenge, but awareness of how the data can influence the model and vice versa is a concern for any IoT or Quantified Self technology. It is the early adopter problem: your initial feedback from fanboys and people who share your vision can blind you to the general public's use cases and expectations. It is the exact problem that caused Google to shut down Glass. LifeQ seems quite aware of the potential founder effect problems.
More important (to me at least) is that they are also engaging the medical community to enable the kinds of use cases that deliver long-term quality of life and better diagnostic test values for health monitoring. These are growth markets and can provide a reliable revenue stream, for example at-home monitoring or ambulatory care for basic monitoring of heart rate, breathing, oxygen levels and blood glucose (coming soon), all of which can be monitored today by LifeQ-powered devices. The problem is that the current monitors with the accuracy LifeQ needs are cumbersome, but they are easier to wear than what most hospitals have and can be worn for long periods without patients being strapped to wires or stuck in bed. The potential for clearer test results under real conditions is tantalizing.
What is next?
Like all start-ups, LifeQ is focused on making its product the best by scrutinizing every element that could negatively affect its core offering. The really interesting piece will come from the meta-analysis once the number of users hits a large enough N to ensure predictability across populations.
LifeQ acknowledged the potential limitations of an optical sensor (skin color, lean-muscle-to-fat ratios, as well as stability issues caused by user activity). They are working on expanding the repertoire of sensors that LifeQ can collect data from as part of the platform.
The real issue that faces LifeQ, and any of the more robust quantified-self devices and analysis platforms, really comes down to action steps. For that matter, the same issue exists for personal genomics. What is the line between normal population variation and a dangerous biometric signature? Is there more harm than good in telling folks everything?
LifeQ has a great platform, and appears to have all the pieces in place to be the "Google for healthcare." They certainly bear keeping an eye on to see what they do next.

Thursday 1 May 2014

The obligation of mHealth vendors to protect patient information

Lately I've been thinking about consumer-focused medical devices. I am a Fitbit user and only ever access the information on my cell. I do not actively share the information in their communities, but I assume that Fitbit uses my information, in aggregate, to make money. I get it: they are a for-profit company and I am receiving an ultra-cheap service; Fitbit needs to make money on that service somehow.

Now that I've got the niceties out of the way, the rest of this blog is angry and rant-y. In terms of full disclosure, some of my anger comes from the recent kerfuffle over General Mills' plan to treat social media as a binding contract to protect itself from litigation. The other part comes from my interactions with a couple of consumer-focused but information-sharing applications. One company provides a cloud-based service that allows doctors and medical students to share patient information, including pictures, with other doctors. It is a great premise BUT......what protection is there for patients?

I contacted the company, and basically their protections are focused on their bottom line: they have a "policy in place that meets all legal obligations in their local jurisdictions"......they also would not disclose, and had no plans to be proactive about applying, any technical protections to block the sharing of patient data.

Am I the only person that has a problem with this?!? 

As we move forward into the wearables and Internet of Things era, what are the obligations of these companies?

We hold customer-facing companies responsible for protecting customer information. Shouldn't a company that provides a service enabling the sharing of medical information be held just as accountable? Should they be allowed to merely point to a piece of "paper" and say "not our problem?!?"

If your policy says that the user must comply with hospital regulations on patient data sharing, you should provide the hospital a method to enforce that policy. As a patient, I need to know that the med student is not sharing pictures of my serious and potentially embarrassing problem just to have a laugh with their friends. It is the reason that Box is such a fast-growing product! It gives end users what they need and it gives the business the protections it may need.
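What would "a method to enforce policy" look like? A hedged sketch: the sharing platform checks hospital-set policy and recorded patient consent at share time, instead of trusting the user. The policy keys, consent flag and domain check are all hypothetical, not any vendor's actual API.

```python
# Minimal sketch: the platform, not the user, enforces hospital sharing policy.

HOSPITAL_POLICY = {
    "allow_external_share": False,   # set by the hospital, not the end user
    "require_patient_consent": True,
}

def can_share(item: dict, recipient_domain: str, org_domain: str) -> bool:
    """Block a share unless policy and recorded consent both allow it."""
    external = recipient_domain != org_domain
    if external and not HOSPITAL_POLICY["allow_external_share"]:
        return False
    if HOSPITAL_POLICY["require_patient_consent"] and not item.get("patient_consent"):
        return False
    return True

xray = {"type": "image/x-ray", "patient_consent": False}
print(can_share(xray, "gmail.com", "hospital.org"))  # False: external + no consent
```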

In this age of PRISM and companies selling your data (see here for stats), wouldn't it be a marketing advantage to tell customers you go beyond the minimum?

Here is my POV on this: If you are enabling sharing of a person's medical information, you are obligated to protect that data from the stupidity or laziness of your users. 

How many busy residents are really going to take the time to ask their patients if they can share the x-ray? Especially if they can capture and share it from a personally owned device? Should I, as a patient, be forced to spell out the conditions under which I will allow students and doctors to share my information?

There should be no such thing as Facebook, Dropbox or Google Drive for doctors! At a minimum you should provide hospitals the option of enabling controls based on their policy, and not just weasel out of it by throwing your hands up and saying, "Hey, we did our job; they told us it was alright."

Tuesday 8 April 2014

Clinical data random information

I've become an information hoarder. As I spend more time thinking about information management and speeding the move to better technical systems, I am amazed how general the principles of design are across different industries.

Here is a noob's (i.e. my) "plain-spoken" understanding of a key term in managing patient data across hospitals, and for predictive analytics and personal health decision making.

Level setting (i.e. the general definition of clinical data warehousing): a clinical data warehouse (CDW) is an organized, integrated, historically archived collection of data keyed to patient identifiers.

For the most part, the purpose of a CDW is as a database for hospitals and healthcare workers to analyze and make informed decisions on both individual patient care and forecasting where a hospital's patient population is going to need greater care (e.g. patients are showing up as obese; therefore specific hospital programs to fight diabetes are a good idea).
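The forecasting question above is, at bottom, a query against the warehouse. Here is a minimal sketch against a toy, in-memory CDW table; the table and column names are invented, while E66.9 is the real ICD-10 code for unspecified obesity.

```python
# Hedged sketch: a population-level question asked of a toy CDW.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE diagnoses (
    patient_id TEXT, icd10_code TEXT, diagnosed_on TEXT)""")
conn.executemany(
    "INSERT INTO diagnoses VALUES (?, ?, ?)",
    [("p1", "E66.9", "2013-04-01"), ("p2", "E66.9", "2014-02-11"),
     ("p3", "I10", "2014-03-05")],
)

# How many distinct patients present as obese, year over year?
rows = conn.execute("""
    SELECT substr(diagnosed_on, 1, 4) AS year, COUNT(DISTINCT patient_id)
    FROM diagnoses WHERE icd10_code = 'E66.9'
    GROUP BY year ORDER BY year""").fetchall()
print(rows)  # [('2013', 1), ('2014', 1)]
```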

Data warehousing in healthcare also has a use in preparing for both full ICD-10 and Meaningful Use implementation. For example, McKesson, through its Enterprise Intelligence module, probably has plenty of CDW management capability for those only interested in meeting the upcoming ICD-10 and Meaningful Use deadlines. These kinds of worries apply only to US hospitals; however, since Canada requires ICD-10 compliance for all EMR systems, this does present a benefit to Canadian healthcare as well.

In principle, data warehousing at its core is about building a relational database, and it should be EMR-supplier agnostic. Since McKesson is an ICD-10 and Meaningful Use-ready supplier, the database itself should conform to standards that would allow general solutions to be used. This article goes through some of the potential benefits and pain points. It is tailored to clinical trials, but the underlying message, that building a CDW is an ongoing procedure, is the same for other uses.

One example of how this may be done is Stanford's STRIDE; they used the HL7 Reference Information Model (RIM) to combine their Cerner and Epic databases. This is part of a larger open-source project that may be an option if an organization has some development expertise.
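The STRIDE-style move is conceptually simple: pull records out of each vendor system and reshape them into one shared model keyed on a common patient identifier. A hedged sketch follows; the field names on both source records are invented for illustration and are not Cerner or Epic APIs.

```python
# Hedged sketch: normalize records from two vendor systems into one model.

def from_system_a(rec: dict) -> dict:
    return {"patient_id": rec["mrn"], "code": rec["dx_code"],
            "observed": rec["dx_date"], "source": "system_a"}

def from_system_b(rec: dict) -> dict:
    return {"patient_id": rec["patientIdentifier"], "code": rec["diagnosis"],
            "observed": rec["recordedAt"][:10], "source": "system_b"}

a_records = [{"mrn": "42", "dx_code": "I10", "dx_date": "2014-01-02"}]
b_records = [{"patientIdentifier": "42", "diagnosis": "E66.9",
              "recordedAt": "2014-03-05T09:30:00"}]

combined = [from_system_a(r) for r in a_records] + \
           [from_system_b(r) for r in b_records]
print(combined)  # both records now share one shape, joinable on patient_id
```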

Since the main users of CDWs tend to be the people doing the analysis (current buzzwords include BI, predictive analytics, enterprise planning, etc.), it is probably useful for Health IT professionals to understand WHO and WHAT the CDW is for within the organization, i.e. have a full-blown information governance plan that places a value on information, not just a risk assessment.

Friday 28 March 2014

Security without usability isn't better healthcare

I spend a lot of my time understanding how information is stored, accessed and protected as part of my role as an IT analyst. I am always astounded at how little of what is standard practice in many industries has filtered over to healthcare and/or life sciences (pharma + biotech + academia).

The recent hubbub about the ACA (AKA Obamacare) has completely drowned out the real transformation opportunity in healthcare. Up until the recent deadlines and political fights regarding the ACA, "everyone" was really concerned about Meaningful Use. The TL;DR version of the MU legislation is this: make information available to care providers and patients.

So what are we really talking about here? It is really pretty simple: it is information management and the processes that guard against misuse while enabling productivity.

Let's be honest: the EHR/EMR solutions implemented at most organizations do not enable productivity or protect information. Doctors hate them because they do not fit their work patterns (see here), hospitals have significant issues with data protection (see here) and, importantly, they are not mitigating the biggest risk to patient outcomes (and hospital liability) (see here).

It is time to re-think the information silos in healthcare.

So if a single poorly accessed EHR is not the answer, what is?

I would argue that we need to think about this based on information flow and how we expect the value to be delivered. In this case, that value is patient care.

An interesting model to think about is the Canadian delivery model. For example, Ontario E-health has determined it is neither cost effective nor timely to build a single system for every hospital. At the moment, 70% of all physician practices and hospitals already have some sort of EHR system in place. So rip and replace is not an option; the reality is we need to make lemonade.

Since Ontario funds the hospitals through direct allocation of tax revenue, it is loath to flush that money down the drain.

Therefore the best approach is to control the data itself (including digital images, prescription history, surgery, etc.) and let the individual hospitals control how they view and use the data.

In other words- Make it easier to access information based on who you are and what you need the information for!

Focus on the Information exchange layer

Consolidated Information Management layout for Patient care focus. 
So how do we do this without moving to brand new systems and shiny new toys?

The same way every other industry is doing it, especially low-margin, high-risk industries such as oil and gas, insurance and manufacturing. Keep the clunky but very secure system and take advantage of the new technologies that enable information sharing. Instead of an all-in-one solution, add an ECM or portal to manage rights, search and presentation. It will be more cost-effective than doing nothing or ripping and replacing.

This structure controls movement and access to patient data, allowing for quick access to the appropriate information based on job and location.  It provides a structure that takes advantage of the current investment in a secure database yet provides a flexible layer that is designed to convey information in context for end users. 
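As a hedged sketch of that flexible layer: the EHR stays the untouched system of record, and a thin service in front of it decides which slice of the record each caller sees based on job and location. The roles, locations and field lists below are invented for illustration.

```python
# Minimal sketch: a context layer over the EHR that filters by job + location.

RECORD = {  # returned unchanged by the secure EHR back end
    "mrn": "0042", "name": "J. Doe", "ward": "3B",
    "allergies": ["penicillin"], "billing_codes": ["I10"],
}

VIEWS = {  # the exchange layer decides the slice, not the storage system
    ("physician", "on_site"): ["mrn", "name", "ward", "allergies", "billing_codes"],
    ("physician", "remote"): ["mrn", "allergies"],
    ("billing", "on_site"): ["mrn", "billing_codes"],
}

def render(record: dict, job: str, location: str) -> dict:
    """Return only the fields this job/location pairing is entitled to see."""
    return {k: record[k] for k in VIEWS.get((job, location), [])}

print(render(RECORD, "physician", "remote"))
# {'mrn': '0042', 'allergies': ['penicillin']}
```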

This may not be the best system, or the system that you would design from scratch with an unlimited budget, but it gives you long-term flexibility AND doesn't require a rip and replace of your current EMR/EHR. It should provide very good, highly usable healthcare at a reasonable cost.

The way they are going about the change may not be splashy but it will work for both patients and doctors- that’s a great thing. The one thing it won’t fix is the doctors who refuse to use it-and that is a bad thing.

There is additional cost involved in this model, but if the doctors and nurses do not use what you have now.....would salvaging that investment be better?

I'd love any comments or critique of the model.

Saturday 3 November 2012

Funding research in the new (poorer) world


The world has changed. Money is tight, and the large foundations and governmental granting agencies are risk-averse........which means senior scientists will get the $.

This means that innovation is going to die. Once you get to be a senior scientist you don't have time to fail; you have to feed the beast. I think the best analogy is Wall St. Yes, "too big to fail, I need a bail-out" Wall St. is exactly what most large labs in the world are!

Rather than launch into some quixotic diatribe about how bad this is for health and for science as an endeavor, I'll just talk about how to make it irrelevant.

The answer is small foundations. They have the focus, passion and community to start a real long-term relationship with young scientists. The brand-new, too-ignorant-to-know-better scientist who got their own lab by being the most innovative and best-prepared post-doc is exactly the one who will take the chance, if there is money involved.

All foundations seem to be focusing on drug discovery and biomarkers. This is a great space for smaller disease-focused foundations to occupy. The problem is that most of these organizations do not have the gobs of money to follow it through in a comprehensive manner.


Getting effective novel therapeutics requires engagement of talented, creative scientists


For disease-focused research foundations this can be difficult. In the current economic environment, foundations must have some method to "hook" scientists, whether it be large per-year grants, limited restrictions on spending or speed of review.


For rare diseases this means getting in during scientists' early phase, when they shape and limit the vision of their lab. Rare disease research requires passion and a way to find general funding that will maintain the lab.


In this day and age, when peer-based mentoring is sadly lacking, a foundation that can provide some guidance on where and how their researchers can leverage funding and expertise can gain loyalty and expand by word of mouth. This will then lead to larger "sexy" studies and fund-raising.


This kind of thinking can work hand in hand with maximizing fund-raising, as you can tell fund raisers that a large portion of their money goes directly to attempting to cure or ease their disease rather than to the greater good.


Overall: small foundations should look to maximize the effect of their funds to effect change in their specific disease.

They can do this by targeting two areas of research:
1. Cellular characterization of the disease (i.e. what properties differ between the normal and disease states)
2. Drug/therapeutic design and testing, no matter how speculative. This assumes that the idea or test system can pass peer review.


The two areas would have separate competitions and two separate funding paradigms:

1. A short funding cycle on a micro-finance model: short grants with quick turnaround. A 2-year grant with a hard progress report against mutually agreed-upon, measurable progress.

2. A prestige grant with larger, "no questions asked" funding: 5 years of funding with no reporting for the first 2 years. Again, mutually agreed-upon, defined goals that MUST be met to receive the final 3 years of $.


The grants would be open to academia and small biotech. There would also be a bonus for academic-led Pharma-academic RFPs, with a significant and clear partnership, NOT just "in-kind" contributions.


The research and fund-raising would have a high degree of back and forth. The foundation would hold a stakeholders' conference where selected funded scientists would come and explain the state of research in layman's terms.


There has to be greater outreach from the scientists at foundations. The office of the CSO should engage in various forms of social media to engage supporters and find funds (with the guidance of the executive board). This can no longer be left to the summer intern who just finished Bio 101. The public is too smart for that, and frankly, if I were looking to a small foundation for funding, I would want to know that the scientists are engaged. It should be an expectation, not just a hope, that scientific merit is judged by scientists.

Sunday 16 September 2012

Time for some convergent evolution in knowledge management

As I move from the ivory tower of neuroscience to the practical, business-related advice that Info-Tech gives clients on their IT environment, I'm amazed at how many parallels I see in the needs and the solutions in all kinds of human endeavours.

For example, I just finished talking to a vendor about how enterprises can manage and maximize their content (images, documents, blogs, etc.). Much like my own thinking on this, @OpenText is convinced the core issue is about information movement, not what it is stored in (i.e. a Word doc vs. an Excel file).

For me this comes back to a practical problem that I had as a graduate student. My Ph.D. was on how gene expression relates to brain development. The brain develops in a fascinating manner; it starts out as a tube that grows outwards at specific points to build the multi-lobed, broccoli-esque structure that allows all vertebrates, but particularly mammals, to have diverse behaviours and life-long learning. These complex behaviours rely on an immensely diverse set of brain cell types. Not only is there great diversity of cells, but each cell needs to get to the right place at the right time.

Think of a commute on the subway; not only do you need the right line but if you don't get to the station at the right time you won't get to work on time. This could lead to you getting fired. For brain cells this could lead to death. For the organism it could mean sub-normal brain function-and potentially death. The fact that the process works is a testament to the astounding flexibility and exception management built into cells by their epigenetic programming.

There is, however, one big problem with this type of brain development: the skull. The skull limits the number of cells that can be created at any given time. Practically, this means that the level of control that must be exerted on the number of any one cell type is very tight. The control comes from coordinating which genes are expressed in each cell type to allow cells to make decisions on the fly. Usually it starts with brain cells taking off in a random direction, which then informs them of what type of brain cell they will end up being when they arrive. The cells then proliferate as they move, based on the contextual information that they receive about how many more cells are needed. This all happens through cell-to-cell communication and rapidly changing patterns of gene expression.

(Wait for it, I'll get back to the parallel problems, honest....)

As you can imagine, this was (and still is) a daunting problem to investigate. My research involved a variety of time-staged images; reams of Excel workbooks on cell counts and brain size; Word docs on behaviour; and whole-genome expression sets. It was a big data problem before the phrase existed. (Business parallel No. 1.) In reality I had no problem keeping track of all this data, looking at each piece and doing the analysis on each piece. I had very good notes (metadata) and file naming conventions (classification) to ensure that I could easily find the file I needed. I was in effect a content management system. (Business parallel No. 2.) The problem was synthesizing the separate analyses into a cogent piece of information, i.e. something that can be shared with others in a common language that allows others to build their own actionable plan. (Business parallel No. 3.)
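For what it's worth, the "one-person content management system" can be approximated in a few lines: a naming convention plus a sidecar metadata record make files findable and linkable without any database. The scheme below is invented for illustration, not the one I actually used.

```python
# Hedged sketch: a file naming convention (classification) plus notes (metadata).
from datetime import date

def dataset_name(experiment: str, subject: str, assay: str, when: date) -> str:
    """Build a predictable, sortable filename."""
    return f"{experiment}_{subject}_{assay}_{when.isoformat()}.csv"

fname = dataset_name("braindev", "e14", "cellcount", date(2001, 3, 15))
metadata = {
    fname: {
        "notes": "cerebellar counts, litter 3",  # human-readable context
        "links_to": ["braindev_e14_expression_2001-03-15.csv"],  # cross-references
    }
}
print(fname)  # braindev_e14_cellcount_2001-03-15.csv
```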

Any scientist reading about my dilemma from 15 years ago can probably relate, and so can anyone else who uses and presents information as part of their job. The reality is that technology can only solve the problem if people recognize the problem and WANT to be systematic in their habits.......the willpower to be repetitive in their approach to work is sorely lacking in most knowledge workers. Ironically, a lack of structure kills creativity by allowing the mind too much space to move within. The advent of the online NIH databases for genomic, chemical and ontological data has given scientists a framework to work within to quickly get up to speed in new areas of investigation. Unfortunately this has not trickled down to individual labs (again, more proof that trickle-down anything doesn't work effectively; it's just not part of human nature).

This lack of a shared framework across multiple laboratories is becoming a real problem for both Pharma and academia (and everyone else). The lack of a system has led to reams of lost data, along with the nuggets of insight that could provide real solutions to clinical problems. (Business parallel No. 4.) It also leads to duplication of effort and missed opportunities for revenue (grant) generation. (Business parallel No. 5.)

From a health perspective, if we knew more about what "failed drugs" targeted, what gene patterns they changed and what cell types they had been tested on, we could very quickly build a database. From a rare disease perspective, the cost of medical treatment is partially due to the lack of shared knowledge. How many failed drugs could be of use against rare diseases? We will never know.

This is a situation where scientists can learn from the business community about the technical tools that really allow long-term shareable frameworks. These technical controls are available at any price. Conversely, the frameworks and logic that scientists use to classify and link pieces of content have lessons for any knowledge worker.

It's time for some open-mindedness on both sides; the needs of all kinds of organizations and workers are converging: too much data, too many types of data, not enough analysis. Evolution is about taking those "things" that work and modifying them for the new environment.


Saturday 9 June 2012

Health IT and clinical research

I recently returned from the E-Health Canada conference in Vancouver. I was there as an analyst for Info-Tech research group. I spoke about secure use of consumer devices in healthcare and the potential of cloud computing as a flexible model to deal with CoIT.


I was pleasantly surprised by the IT knowledge level of the nurses and doctors that attended the conference. Why was I surprised, in this age of consumer devices and self-service tech? Well, I spend a fair amount of my time talking to and about IT departments. The impression that I get from many of them is that while most people can set up their email on their phone, they remain largely clueless about the actual tech itself.


Well, there is certainly a core of nurses and doctors who understand the tech and have really great ideas for how to make it work better in the context of healthcare delivery. I was left with the feeling that many remain frustrated with the current solutions and the pace at which E-health is moving forward. The major areas of frustration were around content delivery and system upgrades for usability. I would summarize it as: "You must do something; protect the data but be flexible on the device used." Technology should allow doctors to spend more time looking at and talking to patients, not with their noses buried in a tablet. Just placing a tablet in doctors' hands doesn't mean it will lead to increased use of IT solutions for healthcare.


While some may point to this as a technology problem, it was clear from talking to the vendors that the solutions available today can meet the demands that hospitals are placing on IT. As someone with a past in research who has dabbled in clinical research, it was refreshing to know that there is a variety of solutions out there to make clinical research easier. What was interesting was how similar Health IT problems are to those of many other industries. In my opinion the issue is now about getting the best out of the tech, not when the tech will exist.


In other words, it's about getting the users and the admins on the same page.


As someone with a deep passion for rare diseases and the use of high-level biomedical research, it's somewhat frustrating that the system in use today is so antiquated. The upgrades available today could add so much intelligence to how we treat rare diseases in particular.


The new areas of stem cell therapy and epigenetics hold HUGE promise if we can understand the relationship between disease and biology in these patients. Since these patients are so rare at any given location, it is imperative that we have a way to share the data that is safe enough to move patient history across borders.

Wednesday 29 February 2012

Rare diseases and open access.


I've found myself completely mesmerized by the open access/open science debate. As a recovering bench scientist, it has made me think about a variety of things, but one that is really interesting is the implications for rare disease research and the speed of turning great benchwork into viable drug targets. I'll deal with the larger debate on open access separately, but I wanted to put forward something today (Feb 28th 2015, Rare Disease Day).

2016 update: In my opinion, not much has changed in rare diseases in the last year. There have been some moves forward, but like anything in drug and/or therapeutic research, it is time consuming. I hope that the silence is because we are getting to the point where folks have rolled up their sleeves and are working, not talking.

I am really excited about the prospects for increasing the speed at which potential drug targets can go from bench to bedside. The new technologies (gene sequencing, clinical data) can provide faster turnaround through efficient data sharing and new genomics technology. The real potential payoff is the new clinical data that will be available once EMRs are implemented widely. The value of that much data, combined with the new genome sequencing technologies, can really provide some much-needed guidance about the genotype-phenotype relationships that may link certain rare diseases. I say may, since it will really come down to data quality and wide dissemination of that data. Getting clinical data into the hands of the molecular biologists and biochemists who can do the bench research is vital to drug design.

2014 Update: With the roll-out of the ACA starting to happen and the FDA crackdown on 23andMe, the landscape for studying and curing rare diseases just got a little better. There is plenty of information on the 23andMe imbroglio, but this piece from the Huffington Post is the least sensationalist. My opinion is that the FDA made a decision based on the specific business's lack of response; it is not an indictment of consumer genetics or any paternalistic over-reach. Matthew Herper has a really great analysis of the stupidity and/or hubris that 23andMe showed. The Global Genes Project has a nice blog on the relationship between rare diseases and the ACA.

The bad news is that the sequester has set research back years, if not decades, and may very well have robbed a whole generation of scientists of their careers (this author included). Tom Ulrich of Boston Children's Hospital has a nice blog on the subject.
2015 Update: The new interesting initiative is precision medicine in the US. I really am proud of the way the Global Genes Project is coming along. I was briefly involved when Nicole Boice started the initiative, and I look forward to seeing how it continues to grow in 2015. I think it does a great job of keeping the conversation on awareness and providing a site to aggregate "best practices" for the rare disease community.

I really hope that the continued access to healthcare that we started to see in 2014 continues. The key will be: what do we do with the basic information that clinics gather about their rare disease patients? How do we make that shareable across clinics? This, in my opinion, will be the key to consistent diagnosis and clear symptoms, which will then better inform scientists of which genes contribute to the phenotype. This is the basis of drug discovery and treatments.

Right now is a real nexus of information due to the convergence of new technologies with "new" fields of study. Epigenetics is the study of how and why genes get turned on and off; the best analogy is: if the whole genome is the book of life, genes are the words and epigenetics is the sentence, paragraph and chapter structure that gives the words meaning. The other area is an off-shoot of stem cell research: induced pluripotent stem cells (iPSCs). iPSCs, as the name implies, are induced to become stem cells from a variety of other cell types, the most clinically relevant being skin and blood. While the debate rages about iPSCs and their value for replacing non-working cells with new ones [regenerative medicine], one thing that is not in doubt is the power of these cells for modeling disease. iPSCs can be made from patient samples and then shared with other researchers. This may seem trivial, but the more people looking at the same model, the quicker the core problem can be found. If done right, the sharing of iPSCs with researchers who use different techniques (biochemists, molecular biologists, cancer researchers, etc.) will provide a 360-degree view of the disease.

Update 2014: Unfortunately it seems that iPSC research is becoming marred by scandal. The new "most promising" discovery may be "less real" than one would hope.....Paul Knoepfler has a blog on the subject. BTW, if you have any interest in stem cells you should follow Knoepfler's blog; he is an excellent writer and a top-notch scientist.

Update 2015: I think we are past the really bad period; unfortunately it has also diminished the enthusiasm for iPSCs as models. Although I am not surprised, it has recently been shown that iPSCs form different sub-types of cells based on their tissue of origin (see here for neural and here for heart). This seriously limits the usefulness of iPSCs for drug discovery and would just exacerbate the reproducibility issues that are plaguing science in general, but particularly stem cell research.

Once this happens it's likely that links will be found that can make drug discovery and testing palatable for biotech and big pharma. Drug discovery is expensive but if the community can gather enough information about the molecular and biochemical characteristics of rare diseases then the existing "orphan drugs" can be tested against the characteristics rather than any single disease. 

Update 2015: The orphan drug area is one where we are starting to see movement, for example the recent announcement that the CF Foundation received $3.3B for its royalty rights to Kalydeco. It is an interesting approach that should be considered by any rare disease group looking to expand support and the potential therapies for their disease.

As always, the caution is: who should get the money, and how do you ensure that the cost of the drug to sufferers is appropriate? If the foundation funds the study (in part), do they have an obligation to ensure that the cost of the therapy is reasonable for the average person?

The elephant in the room is, of course, paying for all of this. Scientists need to be able to publish to get grants to pay for post-docs and reagents. There is some money available from disease foundations, but it doesn't cover all the costs that a lab needs to run. That is the job of the NIH. However, their mandate really requires that grants are given out based on WIDE applicability of the research and the grantee's history of research in that area. Unfortunately this model does not serve the rare disease community very well, nor does it foster a wide range of scientific endeavor. There are hundreds of examples where a rare disease has led to unique insight into a biological pathway that was key to some cancer or other disease.

Update 2014: Rare disease research will survive, but we need to start to fast-track new funding models that focus on highly innovative projects. We know what hasn't worked; we need some research that is different.

Update 2015: Unfortunately I can't say there has been much movement on this. Frankly, scientific funding is horrible right now. I think for rare disease foundations there is an opportunity to foster young scientists to be advocates for and invested in their disease, but this requires a new way of thinking about HOW to fund rather than WHAT to fund.