1 Introduction

Structured reporting; the creation and use of controlled vocabularies/ontologies to describe image findings; and the development of medical natural language processing were all pursued within radiology as aids towards being able to search and index textual reports (and hence the related imaging). Though great strides have been made in these areas, research efforts are still very active: within routine clinical care, the process of documenting observations largely remains ad hoc and rarely meets the standards associated with a scientific investigation, let alone making such data "computer understandable."

Understanding Images: Today's Challenge

The modern use of the adage, "A picture is worth ten thousand words," is attributed to a piece by Fred Barnard in 1921; and its meaning is a keystone of medical imaging informatics. The current era of medical imaging informatics has turned to the question of how to manage the content within images. Presently, research is driven by three basic questions: 1) what is in an image; 2) what can the image tell us from a quantitative view; and 3) what can an image, now correlated with other clinical data, tell us about a specific individual's disease and response to treatment? Analyses are looking to the underlying physics of the image and biological phenomena to derive new knowledge; and combined with work in other areas (genomics/proteomics, clinical informatics), are leading to novel diagnostic and prognostic biomarkers.

While efforts in medical image processing and content-based image retrieval were made in the 1990s (e.g., image segmentation; computer-aided detection/diagnosis, CAD), it has only been more recently that applications have reached clinical standards of acceptability. Several forces are driving this shift towards computer understanding of images: the increasing amount and diversity of imaging, with petabytes of additional image data accrued yearly; the formulation of new mathematical and statistical techniques in image processing and machine learning, made amenable to the medical domain; and the prevalence of computing power. As a result, new imaging-based models of normal anatomy and disease processes are now being formed.

Knowledge creation. Clinical imaging evidence, which is one of the most important means of in vivo monitoring for many patient conditions, has been used in only a limited fashion (e.g., gross tumor measurements), and the clinical translation of derived quantitative imaging features remains a difficulty. And, in some cases, imaging remains the only mechanism for routine measurement of treatment response. For example, a recent study suggests that while common genetic pathways may be uncovered for high-grade primary brain tumors (glioblastoma multiforme, GBM), the highly heterogeneous nature of these cancers may not be sufficiently prognostic [17]; rather, other biomarkers, including imaging, may provide better guidance. In particular, as the regional heterogeneity and the rate of mutation of GBMs are high [13], imaging correlation could be important, providing a continuous proxy to assess gene expression, with subsequent treatment modification as needed.

In the short term, the utilization of imaging data can be improved: by standardizing image data pre- and post-acquisition (e.g., noise reduction, intensity signal normalization/calibration, consistent registration of serial studies to ensure that all observed changes arise from physiological differences rather than acquisition); by (automatically) identifying and segmenting pathology and anatomy of interest; by computing quantitative imaging features characterizing these regions; and by integrating these imaging-derived features into a comprehensive disease model.
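The normalize/segment/quantify steps of that pipeline can be sketched in a few lines. This is a minimal illustration on synthetic data, not a clinical method: the z-score normalization, the fixed intensity threshold, and the function names are all assumptions chosen for demonstration (a real system would use calibrated normalization and anatomy-aware segmentation models).

```python
import numpy as np

def normalize_intensity(image):
    """Z-score normalization, so serial studies share a comparable intensity scale."""
    return (image - image.mean()) / image.std()

def segment_region(image, threshold=2.5):
    """Crude segmentation: keep voxels brighter than `threshold` standard
    deviations. The threshold is hypothetical, purely for illustration."""
    return image > threshold

def quantitative_features(image, mask, voxel_volume_mm3=1.0):
    """Compute simple quantitative imaging features over the segmented region."""
    region = image[mask]
    return {
        "volume_mm3": float(mask.sum() * voxel_volume_mm3),
        "mean_intensity": float(region.mean()) if region.size else 0.0,
        "max_intensity": float(region.max()) if region.size else 0.0,
    }

# Synthetic "scan": Gaussian background noise plus a bright 10x10x10 lesion.
rng = np.random.default_rng(0)
scan = rng.normal(100.0, 5.0, size=(32, 32, 32))
scan[10:20, 10:20, 10:20] += 40.0

normalized = normalize_intensity(scan)
mask = segment_region(normalized)
features = quantitative_features(normalized, mask)
print(features)  # recovers a lesion volume near 1000 voxels
```

The dictionary of derived features is the artifact the chapter's last step refers to: a structured, per-region summary that can be integrated with other clinical data into a disease model, rather than leaving the measurement implicit in the pixels.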