METHOD AND SYSTEM FOR IMAGING AND ANALYSIS OF ANATOMICAL FEATURES
A method and system are provided for characterizing a portion of biological tissue. The invention comprises a smartphone- and tablet-deployable mobile medical application that uses device sensors, Internet connectivity and cloud-based image processing to document and analyze physiological characteristics of hand arthritis. The application facilitates image capture and performs image processing that identifies hand fiduciary features and measures hand anatomical features to report and quantify the progress of arthritic disease.
This application is a continuation of U.S. patent application Ser. No. 14/702,570 filed 1 May 2015, which claims benefit of U.S. provisional patent application No. 61/988,002 filed 2 May 2014. Each of the foregoing applications is hereby incorporated by reference in its entirety for all purposes.
FIELD OF THE INVENTION

The present invention relates to methods and systems providing medical equipment for diagnosis, analysis and monitoring of treatment, and to detecting, measuring or recording devices for testing the shape, pattern, colour or size of the body or parts thereof for diagnostic purposes.
BACKGROUND OF THE INVENTION

Arthritis is one of the most common health problems affecting people throughout the world. Hand arthritis primarily affects the articulating joints of the hand and can cause pain, deformity and moderate to severe disability. Hand arthritis is actually many diseases but is grouped into two main types: osteoarthritis (OA) and inflammatory arthritis (IA), including rheumatoid arthritis. Typical symptoms of hand arthritis are joint swelling and pain. While radiographic features of osteoarthritis are found in 67% of women and 55% of men 55 years and older, symptomatic osteoarthritis is less prevalent.
Recent studies have shown that erosive osteoarthritis of the interphalangeal (IP) joints is an important subset of osteoarthritis because it causes significant functional impairment and pain. While not as severe in terms of pain and disability as inflammatory arthritis, painful erosive osteoarthritis has a greater impact on the general population. One of the common features of symptomatic erosive osteoarthritis is inflammatory episodes in the early course of the disease that result in swelling and tenderness; this condition is sometimes referred to as inflammatory osteoarthritis. The swelling and tenderness manifest in the nerves, blood vessels and supporting matrix that supply the synovial membrane that encapsulates the joint and produces the synovial fluid that lubricates the joint. They can be assessed by visual observation and palpation, and by quantitative measurements of grip strength.
Many research reports have attempted to quantify and correlate radiographic measurements, functional measurements and patient questionnaires. Treatment remains primarily palliative with very few surgical interventions, such as interphalangeal joint replacement or fusion. Symptomatic rather than radiological presence of osteoarthritis remains the primary indicator of the need for intervention, in most cases by pain control medication.
There have been a number of research initiatives to use optical methods to analyze interphalangeal joint disease including optical coherence tomography, diffuse optical tomography, laser trans-illumination imaging, photoacoustic tomography and digital imaging of both hands and radiographs. In many areas of disease the understanding of the interaction of light and tissue and its application in diagnosis has expanded rapidly. These techniques have historically required specialized equipment for measurement and interpretation.
With the advent of wireless mobile computing devices such as smartphones and tablets this constraint is rapidly changing. Mobile devices are becoming part of the health care ecosystem and applications for smartphones and tablets are proliferating rapidly. The use of imaging and other sensors in smartphone applications is now common and available for the majority of the population in the developed world and many in the developing world.
Coincident with the universal deployment of smartphones is the development and ongoing standardization of the electronic health record, as well as the evolution of legislative guarantees of personal access to health records and privacy requirements for agencies transmitting and using electronic health records. These provide ways in which individuals can now have greater autonomy in how they engage with their health providers or payers and access their health records. This has also resulted in the evolution of the personal health record services now offered by major telecom and software companies, including Microsoft.
Active patient participation in the management of their disease has been shown to reduce the perceived pain and disability and provide a greater sense of well-being.
It is a goal of this invention to provide individuals who may be developing or have developed arthritis with digital tools to assess and monitor the progress of their disease using their smartphone or tablet as a mobile medical device.
SUMMARY OF THE INVENTION

This invention comprises a smartphone application that allows an individual concerned about or experiencing the symptoms of arthritis to use their smartphone to collect information and to make measurements of their hands. This information can be analyzed to identify changes in the anatomy of the hand that are inconsistent with normal expectations and to track these changes over time. The application is intended to collect sensor data from the smartphone and to analyze and correlate this with biographical information, experiential measures of pain and movement, medication use, weather and regional demographics. It is intended to integrate with existing health record systems compliant with the ISO/IEEE 11073 standards, meet HIPA/HIPAA and other privacy standards, and connect to personal health records such as Microsoft HealthVault.
While this invention describes measurement of the hand, it will be readily understood that the invention can be used to measure a range of anatomical features, including the foot, the leg, the knee, the shoulders or the whole body, and any sub-feature of an anatomical feature such as a wound, lesion or skin area exhibiting discoloration indicative of disease or trauma. The body or anatomical feature measured need not be human. For example, it could be the body of a mouse, dog or other animal.
The invention comprises a mobile app on a smartphone that collects basic biographical information, captures and calibrates images of the hand, performs anatomical analysis of the calibrated hand image to identify key fiduciary features, makes measurements of the hand anatomy, and reports and tracks these measurements over time. In some embodiments of the invention the data is transferred to and stored on a cloud database server connected wirelessly to the smartphone. In some embodiments of the invention the calibration and analysis of the data is performed by software deployed on a cloud processing server connected to the cloud database server. In some embodiments of the invention the analyzed data and reports are transferred to a personal health record system on a cloud database server. The analysis will identify key features of hand arthritis such as the presence and location of Heberden or Bouchard nodes, angular deviation of the phalanges at the interphalangeal and phalange-metacarpal joints, and other characteristic features of osteoarthritis or inflammatory arthritis. Individuals may provide their personal physician, or other health providers, access to this information via their personal health record.
In some embodiments of the invention the method will incorporate biographical and environmental data into the database and analyze these to provide graphical reports of correlations between individual pain, hand appearance, weather, location, age and gender, and comparisons to typical expectations of those without comparable symptoms.
It is to be understood that this summary is provided as a means for generally determining what follows in the drawings and detailed description, and is not intended to limit the scope of the invention. The foregoing and other objects, features, and advantages of the invention will be readily understood upon consideration of the following detailed description taken in conjunction with the accompanying drawings.
A concise description of the mechanism of the system is provided below.
The following section provides definitions for terms and processes used in the technical description.
A ‘Cloud Server’ is a virtual private Internet server that enables users to install and run applications, maintain databases and communicate with external input/output devices much like a physical server. It offers flexible, fast outward scalability for operations that is not offered by physical servers.
A ‘Cloud Processing Server’ is a Cloud Server equipped with sufficiently powerful central processing units (CPUs) and available memory and that functions primarily to process or analyze information, for example, complex image processing.
A ‘Cloud Database Server’ is a Cloud Server that functions primarily to store and retrieve data that can then be processed, analyzed or reviewed, typically after being transferred to another computer system.
A ‘mobile application’ is a software application that runs in a mobile platform environment such as Android, Apple iOS or Windows Mobile, deployed on smartphones and tablets.
An ‘electronic health record’ is a digital record of patient and physician information that can be shared across different health care settings.
A ‘Hough transform’ is a technique that uses a voting procedure in a parameter space to extract features of an object, in this case long straight lines and lines that form a large blob.
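The application text does not include an implementation; as a minimal illustrative sketch (not part of the claimed system), the voting procedure can be shown for straight-line detection, where each edge point votes for every (theta, rho) line passing through it and the most-voted bin identifies the dominant line:

```python
import math
from collections import Counter

def hough_peak(points, n_theta=180, rho_step=1.0):
    """Vote each edge point into (theta, rho) bins; the bin with the
    most votes corresponds to the dominant straight line through the
    points, using the normal form rho = x*cos(theta) + y*sin(theta)."""
    votes = Counter()
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(t, round(rho / rho_step))] += 1
    (t, r), _ = votes.most_common(1)[0]
    return math.pi * t / n_theta, r * rho_step

# Synthetic example: a horizontal line y = 5 should peak at
# theta = 90 degrees, rho = 5
points = [(x, 5) for x in range(50)]
theta, rho = hough_peak(points)
```

This sketch covers only line detection; extracting the large-blob lines mentioned above would require further processing of the accumulator peaks.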
An ‘affine transformation’ is a geometric transformation that preserves the ratio of distances between points that lie on a straight line. This technique will be used to correct distortion and warping of objects in an image.
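As an illustrative sketch only, a two-dimensional affine transformation is a linear map plus a translation; the example below (with hypothetical coefficients) shows that a point midway between two others remains midway after the transformation, which is the ratio-preservation property noted above:

```python
def affine(points, a, b, c, d, tx, ty):
    """Apply the affine map (x, y) -> (a*x + b*y + tx, c*x + d*y + ty)."""
    return [(a * x + b * y + tx, c * x + d * y + ty) for x, y in points]

# Shear and translate three collinear points; the middle point remains
# exactly midway between the endpoints after the transformation.
pts = [(0, 0), (1, 1), (2, 2)]
out = affine(pts, 1, 0.5, 0, 1, 3, -2)
```

Distortion correction as described later in the application amounts to estimating such coefficients from the reference object and applying the inverse map to the image.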
‘K-means clustering’ is a vector quantization method used to cluster observations into groups of related observations.
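A minimal sketch of two-cluster K-means on a single scalar channel (the values below are hypothetical chrominance samples, not data from the application) illustrates the alternation between assigning observations to the nearest centroid and recomputing each centroid as its cluster mean:

```python
def kmeans_1d(values, iters=10):
    """Two-cluster K-means on scalar values (e.g. a chrominance channel):
    alternately assign each value to the nearest centroid, then update
    the centroids as cluster means. Assumes both clusters stay non-empty,
    which holds when the data is bimodal as in skin/background separation."""
    c0, c1 = min(values), max(values)
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(g0) / len(g0)
        c1 = sum(g1) / len(g1)
    return sorted((c0, c1))

# Hypothetical samples: background chrominance near 0.1, skin near 0.8
vals = [0.10, 0.12, 0.09, 0.11, 0.78, 0.82, 0.80, 0.79]
lo, hi = kmeans_1d(vals)
```

In the segmentation described later, each pixel would be labeled hand or background according to which of the two final centroids its chrominance is nearer.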
A ‘boundary pixel’ is an image pixel that represents an intensity and coordinate on the traced boundary of an object in the image.
A ‘Heberden node’ is a bony swelling that develops on distal interphalangeal joints.
A ‘Bouchard node’ is a bony swelling that develops on proximal interphalangeal joints.
A ‘fiduciary point’ is the representation in image coordinates of anatomical features including fingertips, the vertices between the fingers, the joints of the fingers and similar features.
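The application does not specify a fingertip-detection algorithm; one common approach, sketched below purely for illustration, flags boundary points whose distance from the boundary centroid is a local maximum. On a splayed hand these peaks fall near the fingertips and the local minima near the vertices between fingers. The star-shaped test boundary is a stand-in for a traced hand outline:

```python
import math

def fingertip_candidates(boundary):
    """Return boundary points that are local maxima of distance from the
    boundary centroid. For a traced hand outline these candidates lie at
    fingertip-like fiduciary points."""
    cx = sum(x for x, _ in boundary) / len(boundary)
    cy = sum(y for _, y in boundary) / len(boundary)
    d = [math.hypot(x - cx, y - cy) for x, y in boundary]
    n = len(d)
    return [boundary[i] for i in range(n)
            if d[i] > d[i - 1] and d[i] > d[(i + 1) % n]]

# Synthetic five-pointed star: ten boundary points alternating between
# radius 2 (tips) and radius 1 (vertices); exactly five tips expected.
star = []
for k in range(10):
    r = 2.0 if k % 2 == 0 else 1.0
    a = math.pi * k / 5
    star.append((r * math.cos(a), r * math.sin(a)))
tips = fingertip_candidates(star)
```

A real boundary trace would first be smoothed so that pixel-level noise does not create spurious local maxima.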
TECHNICAL DESCRIPTION

The invention comprises a mobile device such as a smartphone or tablet with Internet connectivity, a mobile application installed on the smartphone or tablet, and software to process data provided from the smartphone or tablet. In a preferred embodiment of the invention, the processing software is installed on a Cloud Server. In another embodiment of the invention, the processing software may be installed on the mobile device. At this time, the processing capability of mobile devices is insufficient for some applications. For those applications where the processing capability of the mobile device is sufficient, data processing may occur on the mobile device. In a preferred embodiment of the invention, the method comprises capturing images of the hand using the mobile app on the smartphone and uploading the images to a cloud server for storage and processing.
The invention comprising the mobile device, the mobile application, the cloud data processing server, the cloud data processing software, the cloud database server, the electronic health record software, and the secure communication software is collectively known as the system. The front-end of the system comprises the mobile device and the mobile application, which provides an interface for the user to capture and input images and other data, and provides an interface to review past reports and analyses. The front-end may further comprise a mobile application providing a connection to an electronic health record where user information can be stored.
The back-end of the system comprises the Cloud Processing Server, the data processing software, the Cloud Database Server, and the electronic health record software. The complexity of the data processing software currently requires code structures that cannot be deployed natively on all smartphone environments in a consistent manner. Therefore, it is an advantage of the system to use a Cloud Processing Server to ensure consistency of data processing across many mobile platforms and to provide streamlined performance. The Cloud Database Server hosts the electronic health record software and associated databases storing each unique user's data and images, and interfaces with the Cloud Processing Server. Deploying both the database and the data processing software on cloud servers ensures that the system operates with low communication latency between the data processing server and the database server, providing a faster response time for communicating results to the mobile device. A further advantage of cloud servers is that they provide a deployment environment that is easily scalable for high growth and a secure framework for sensitive patient data.
Turning to the figures, FIG. 1 provides an example of a user-taken image that can be processed by the system. A white paper background of known dimensions [10] is placed beneath the hand [20]. The image capture device is oriented so that the orientation of the rectangular paper [10] is approximately the same as that of the image sensor and hence the captured image. Both the paper and the hand are preferably within the field of view, and the middle finger of the hand is in line with a major axis of the paper [30]. This is the preferred orientation, placement and field of view for image capture and subsequent processing.
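Although the application does not give formulas, the role of the known-size paper background can be illustrated by a simple scale calibration: the paper's known physical width divided by its measured pixel width yields a millimetres-per-pixel factor that converts any pixel measurement lying in the same plane. The numbers below are hypothetical:

```python
def mm_per_pixel(ref_mm, ref_px):
    """Scale factor derived from a reference object of known size: an
    object ref_mm millimetres wide that spans ref_px pixels in the image."""
    return ref_mm / ref_px

# Hypothetical values: US Letter paper (215.9 mm wide) spanning 1080 px,
# and a finger measuring 90 px across in the same calibrated image.
scale = mm_per_pixel(215.9, 1080)
finger_width_mm = 90 * scale
```

This single-factor conversion assumes spatial and perspective distortion have already been corrected, as described in the claims, so that the scale is uniform across the image plane.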
The method further comprises collecting the locations of the fiduciary points and the measurements of anatomical features determined using the method as a data set that can be compared from time to time to determine changes in the anatomy of the hand that may indicate disease progression, healing or other changes that may be diagnostically useful. The method can further comprise collecting sensor information from the smartphone comprising at least one of geographic location, time and date, ambient light levels, smartphone camera settings and characteristics and correlating these with the measurements as part of the data set. The method can further comprise correlating the geographic location and time and date with external databases containing weather data, population statistics such as mortality, disease incidence and similar measures and correlating them with the image analysis. The method can further comprise collecting biographic information from the subject comprising at least one of age, gender, disease status, pain status, medication status, medical history or other useful biographic variables and correlating them with the image analysis.
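One of the anatomical measurements discussed above is angular deviation at a joint. The application does not prescribe a formula; a minimal sketch, assuming three centerline points (proximal, joint, distal) have already been located, computes the angle by which the distal segment departs from the proximal segment's direction:

```python
import math

def joint_deviation_deg(p_prox, p_joint, p_dist):
    """Angular deviation at a joint: the signed angle in degrees by which
    the distal segment departs from the line of the proximal segment,
    given three centerline points (proximal, joint, distal)."""
    a1 = math.atan2(p_joint[1] - p_prox[1], p_joint[0] - p_prox[0])
    a2 = math.atan2(p_dist[1] - p_joint[1], p_dist[0] - p_joint[0])
    d = math.degrees(a2 - a1)
    return (d + 180) % 360 - 180  # normalize to [-180, 180)

# A perfectly straight finger centerline has zero angular deviation.
straight = joint_deviation_deg((0, 0), (0, 10), (0, 20))
```

Tracking this value across visits, as described above, would reveal progressive deviation of the phalanges away from an anatomically normal orientation.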
While the foregoing description of the methods is directed to imaging of the hand for diagnosis and monitoring of hand arthritis and other diseases, it will be obvious to one skilled in the art that the method is equally applicable to diagnosis and monitoring of other human anatomy such as the foot, the arms and the legs, as well as the whole body. In the case of images of the whole body, where the paper reference object may be too small, a substitute reference object such as a door or a wall poster of known dimensions may be used. In some embodiments of the invention, other reference objects may be preferred, including smaller cards, business cards, coins, paper currency or other useful objects.
While the descriptions of the methods above refer to analysis of two-dimensional images, it is also obvious that the method is not limited to two-dimensional images but may be applied to three-dimensional images such as those captured using magnetic resonance imaging, laser scanning tomography, multi-angle imaging reconstruction or any other method of creating a three-dimensional image of an object. In such a case the boundary of the hand would no longer be a two-dimensional linear array of pixels, but a three-dimensional surface composed of voxels.
Claims
1. A system for measuring anatomical features of the hand comprising: an image capture device capable of capturing a color digital image of a splayed hand disposed in front of a known background object, a system processor for analyzing the digital image to locate anatomical features and background features, determining the dimensions of the anatomical features, and providing a report on the anatomical dimensions that can be used to assess the condition of the hand, and a data repository for storing the information as an electronic health record.
2. A method for measuring anatomical features of the hand comprising: capturing a color digital image of a splayed hand disposed in front of a known background, analyzing the digital image to locate anatomical features and background features, determining the dimensions of the anatomical features, and providing a report on the anatomical dimensions that can be used to assess the condition of the hand.
3. The method of claim 2 where the condition being assessed is arthritis of the hand.
4. The method of claim 2 where the condition being assessed is at least one of osteoarthritis, rheumatoid arthritis, or inflammatory arthritis of the hand.
5. The method of claim 2 where analyzing the image further comprises correcting the image for distortions in the image of the hand created by the image capture device.
6. The method of claim 5 where the distortion is at least one of spatial distortion or color balance distortion.
7. The method of claim 5 where the distortion is perspective distortion caused by the angle of the image capture device relative to the hand and background.
8. The method of claim 6 where the spatial distortion is corrected by measuring the shape of the known background in the image, comparing it to the known values for the background, and adjusting the image to correct the spatial distortion.
9. The method of claim 6 where the color balance distortion is corrected by measuring color of the known background in the image, comparing it to the known values for the background, and adjusting the image to correct the color distortion.
10. The method of claim 2 where analyzing the image further comprises segmenting the digital image data representing the hand from the background image data.
11. The method of claim 10 where the method of image segmentation further comprises conversion of the image from a red, green, blue color image to a luminance-chrominance color image.
12. The method of claim 11 where the method of image segmentation further comprises using K-means clustering of the chrominance information of the image to segment the hand pixels from the background.
13. The method of claim 10 where the segmented image of the hand is a binary image.
14. The method of claim 13 where the boundary pixels of the binary image of the hand are determined and are recorded as a sequential array of Cartesian coordinates that trace the boundary of the hand.
15. The method of claim 14 where the boundary pixels are further analyzed to determine the location of anatomical fiduciary points in the image.
16. The method of claim 15 where the fiduciary points are the location of the tips of the fingers in the image.
17. The method of claim 16 where the fiduciary points are the location of the base of the fingers in the image.
18. The method of claim 14 where the boundary pixels are further analyzed to determine the width of the fingers in the image.
19. The method of claim 14 where the boundary pixels are further analyzed to determine the centerline of the fingers in the image.
20. The method of claim 19 where the centerline of the fingers in the hand are analyzed to determine the amount of angular deviation at the joints of the hand.
21. A system for assessment and monitoring of joint abnormalities in a subject caused by disease or injury, wherein said system captures and analyzes an image of an affected anatomical region of said subject where one or more joints are located, said system comprising:
- (a) a mobile device comprising a camera for capturing a digital image of said affected anatomical region and a reference object of known dimensions;
- (b) a mobile application executable by said mobile device and configured to collect sensor data relating to said digital image; and
- (c) processing software executable by a processor and configured to analyze said sensor data to determine anatomical measurements of said affected anatomical region relevant to said assessment and monitoring of joint abnormalities, said measurements comprising the dimensions of said one or more joints and the amount of angular deviation of anatomical structures at said one or more joints.
22. The system as defined in claim 21, wherein said subject is a human being, said affected anatomical region is a hand or foot of said subject, and said joint abnormalities are caused by arthritis.
23. The system as defined in claim 21, wherein said mobile device is a smartphone or tablet computer.
24. The system as defined in claim 23, wherein said mobile device has Internet connectivity and wherein said processor is a cloud processing server.
25. The system as defined in claim 21, wherein said sensor data comprises at least one of image data, camera data, ambient light sensor data, orientation sensor data, geographic location data, and time and date data.
26. The system as defined in claim 21, wherein at least one of said mobile application and said processing software is configured to determine whether image capture conditions of said digital image are acceptable for further processing of said digital image.
27. The system as defined in claim 21, wherein said mobile application is configured to provide an interface enabling a user of said mobile device to input biographical and environmental data relevant to said subject, wherein said biographical and environmental data comprises at least one of biographical information, experiential measures of pain and movement, medication use, weather and regional demographics.
28. The system as defined in claim 27, comprising electronic health record software executable on a cloud database server for storing and retrieving said anatomical measurements and said biographical and environmental data specific to said subject.
29. The system as defined in claim 21, wherein said processing software is configured to produce a report indicating the condition of said one or more joints based on said anatomical measurements.
30. The system as defined in claim 24, comprising a cloud database server operable to store and retrieve reference measurements, wherein said processing software is configured to compare said anatomical measurements to said reference measurements.
31. The system as defined in claim 24 wherein said processing software executable on said cloud processing server is configured to perform operations to determine said anatomical measurements, said operations comprising:
- (a) analyzing said sensor data to make adjustments to said digital image, wherein said adjustments comprise correcting a distortion of said image selected from the group consisting of a spatial distortion, a color balance distortion, and a perspective distortion;
- (b) segmenting a calibrated image of said affected anatomical region from a background of said digital image to provide a segmented image;
- (c) determining a boundary of said affected anatomical region and anatomical fiduciary points in said segmented image; and
- (d) calculating said anatomical measurements based on said boundary and said fiduciary points.
32. A method for assessment and monitoring of joint abnormalities in a subject caused by disease or injury, comprising:
- (a) capturing a digital image of an affected anatomical region of said subject where one or more joints are located and a reference object of known dimensions, wherein said image is captured using a mobile device comprising a camera;
- (b) collecting sensor data from said mobile device relating to said digital image; and
- (c) analyzing said sensor data to determine anatomical measurements of said affected anatomical region relevant to said assessment and monitoring of joint abnormalities, said measurements comprising the dimensions of said one or more joints and the amount of angular deviation of anatomical structures at said one or more joints.
33. The method of claim 32, wherein said subject is a human being, said affected anatomical region is a hand or foot of said subject, and wherein said joint abnormalities are caused by arthritis.
34. The method of claim 32, wherein said mobile device is a smartphone or tablet computer and wherein said method comprises transmitting said sensor data from said mobile device to a cloud processing server.
35. The method as defined in claim 32, wherein said sensor data comprises at least one of image data, camera data, ambient light sensor data, orientation sensor data, geographic location data, and time and date data.
36. The method as defined in claim 32, comprising determining, prior to analyzing said sensor data to determine said anatomical measurements, whether image capture conditions of said digital image are acceptable for further processing, and, if not, capturing a replacement digital image of said affected anatomical region and said reference object.
37. The method as defined in claim 32, comprising collecting biographical and environmental data relevant to said subject using said mobile device, wherein said biographical and environmental data comprises at least one of biographical information, experiential measures of pain and movement, medication use, weather and regional demographics.
38. The method as defined in claim 37, comprising providing a cloud database server for storing and retrieving an electronic health record specific to said subject comprising said anatomical measurements and said biographical and environmental data.
39. The method as defined in claim 32, comprising providing a report indicating the condition of said one or more joints based on said anatomical measurements.
40. The method as defined in claim 32, comprising comparing said anatomical measurements to reference measurements.
41. The method as defined in claim 40, wherein said reference measurements are prior measurements of said one or more joints of said subject, wherein said comparing enables monitoring of any changes to said one or more joints over time.
42. The method as defined in claim 37, comprising comparing said anatomical measurements and said biographical and environmental data to historical reference data, wherein said historical reference data comprises prior anatomical measurements and biographical and environmental data specific to said subject, wherein said comparing enables monitoring of any changes to said one or more joints relative to any changes to said biographical and environmental data over time.
43. The method as defined in claim 42, wherein said monitoring provides a measure of any correlations between said changes to said one or more joints and biographical or environmental factors, wherein said factors are selected from the group consisting of treatment received by said subject, medication received by said subject, physical activity of said subject, geographic location of said subject and weather conditions at said geographic location of said subject.
44. The method as defined in claim 32, wherein said analyzing comprises:
- (a) analyzing said sensor data to make adjustments to said digital image, wherein said adjustments comprise correcting a distortion of said image selected from the group consisting of a spatial distortion, a color balance distortion, and a perspective distortion;
- (b) segmenting a calibrated image of said affected anatomical region from a background of said digital image to provide a segmented image;
- (c) determining in said segmented image a boundary of said affected anatomical region and anatomical fiduciary points within said boundary; and
- (d) calculating said anatomical measurements based on said boundary and said fiduciary points.
45. The method as defined in claim 44, wherein said perspective distortion is corrected by determining the angle of said camera relative to said affected anatomical region and said reference object when said image is captured and adjusting said image to correct said perspective distortion.
46. The method as defined in claim 44, wherein said spatial distortion is corrected by measuring the size and/or shape of said reference object in said image to determine spatial measurements of said object, comparing said spatial measurements to known values for said object, and adjusting said image to correct said spatial distortion.
47. The method as defined in claim 44, wherein said color balance distortion is corrected by measuring the color of said reference object to determine color measurements, comparing said color measurements to known values for said object, and adjusting said image to correct said color distortion.
48. The method as defined in claim 44, wherein said segmenting comprises using K-means clustering of chrominance information for said image to segment pixels corresponding to said affected anatomical region from background image data.
49. The method of claim 32, wherein said reference object is a paper sheet of known dimensions and color.
50. The method of claim 36, wherein said determining whether said image capture conditions of said digital image are acceptable comprises analyzing said orientation sensor data to determine whether the orientation of said camera relative to said affected anatomical region is at a desirable angle.
51. The method of claim 36, wherein said determining whether image capture conditions of said digital image are acceptable comprises analyzing said ambient light sensor data to determine whether illumination of said affected anatomical region is suitable.
52. The method of claim 36, wherein said determining whether image capture conditions of said digital image are acceptable comprises determining if said reference object is within the field of view.
53. The method of claim 44, wherein said affected anatomical region is a hand and wherein said calculating said anatomical measurements comprises determining the length and location of segments of a finger extending between said one or more joints and determining the relative angle of said segments, wherein determining said relative angle provides a measure of joint breakdown causing deviation of said segments away from an anatomically normal orientation.
54. A system for assessment and monitoring of joint abnormalities in a subject caused by disease or injury, wherein said system captures and analyzes an image of an affected anatomical region of said subject where one or more joints are located, said system comprising:
- (a) a mobile device comprising a camera for capturing a single digital image of said affected anatomical region and a reference object of known dimensions;
- (b) a mobile application executable by said mobile device and configured to collect sensor data relating to said single digital image; and
- (c) processing software executable by a processor and configured to analyze said sensor data to determine anatomical measurements of said affected anatomical region relevant to said assessment and monitoring of joint abnormalities, said measurements comprising the dimensions of said one or more joints and the amount of angular deviation of anatomical structures at said one or more joints.
55. A system for assessment and monitoring of joint abnormalities in a subject caused by disease or injury, wherein said system captures and analyzes an image of an affected anatomical region of said subject where one or more joints are located, said system comprising:
- (a) a mobile device comprising a camera for capturing a digital image of said affected anatomical region and a reference object of known dimensions positioned independently of said affected anatomical region;
- (b) a mobile application executable by said mobile device and configured to collect sensor data relating to said digital image; and
- (c) processing software executable by a processor and configured to analyze said sensor data to determine anatomical measurements of said affected anatomical region relevant to said assessment and monitoring of joint abnormalities, said measurements comprising the dimensions of said one or more joints and the amount of angular deviation of anatomical structures at said one or more joints.
Type: Application
Filed: Dec 18, 2020
Publication Date: Jun 10, 2021
Inventors: Nicholas MacKinnon (Vancouver), Fartash Vasefi (Sherman Oaks, CA), Manuka Shanil Gunasekara (Vancouver)
Application Number: 17/127,023