ClinicalTrials.gov

Machine Learning to Analyze Facial Imaging, Voice and Spoken Language for the Capture and Classification of Cancer/Tumor Pain

The safety and scientific validity of this study is the responsibility of the study sponsor and investigators. Listing a study does not mean it has been evaluated by the U.S. Federal Government.
 
ClinicalTrials.gov Identifier: NCT04442425
Recruitment Status : Enrolling by invitation
First Posted : June 22, 2020
Last Update Posted : September 5, 2021
Sponsor: National Cancer Institute (NCI)
Information provided by (Responsible Party):
National Institutes of Health Clinical Center (CC) (National Cancer Institute (NCI))

Brief Summary:

Background:

Cancer pain can have a very negative effect on people's daily lives. Researchers want to use machine learning to detect facial expressions and voice signals. They want to help people with cancer by creating a model to measure pain. They want the model to reflect diverse faces and facial expressions.

Objective:

To find out whether facial recognition technology can be used to classify pain in a diverse set of people with cancer. Also, to find out whether voice recognition technology can be used to assess pain.

Eligibility:

People ages 12 and older who are undergoing treatment for cancer

Design:

Participants will be screened with:

Cancer history

Information about their gender and skin type

Information about their access to a smart phone and wireless internet

Questions about their cancer pain

Participants will have check-ins at the clinic and at home. These will occur over about 3 months. They will have 2-4 check-ins at the clinic. They will check in at home about 3 times per week.

During check-ins, participants will answer questions and talk about their cancer pain. They will use a mobile phone or a computer with a camera and microphone to complete a questionnaire. They will record a video of themselves reading a 15-second passage of text and responding to a question.

During the clinic check-ins, professional lighting, video equipment, and cameras will be used for the recordings.

During remote check-ins, participants will be asked to complete the questionnaire and recordings alone. They should be in a quiet and bright room. The room should have a white wall or background.


Condition or disease
Cancer; Neoplasms; Solid Tumors

Detailed Description:

Background:

  • Pain related to cancer/tumors can be widespread, wield debilitating effects on daily life, and interfere with otherwise positive outcomes from targeted treatment.
  • The underpinnings of this study are chiefly motivated by the need to develop and validate objective methods for measuring pain using a model that is relevant in breadth and depth to a diversity of patient populations.
  • Inadequate assessment and management of cancer/tumor pain can lead to functional and psychological deterioration and negatively impact quality of life.
  • Research on objective pain measurement scales based on automated detection of facial expression using machine learning is expanding but has been limited to certain demographic cohorts.
  • Machine learning models perform poorly when training sets lack adequate diversity, including visibly different faces and facial expressions; this motivates the proposed study to lay a guiding foundation by constructing a more general and generalizable model based on faces of varying sex and skin phototypes.

Objectives:

  • The primary objective of this study is to determine the feasibility of using facial recognition technology to classify cancer-related pain in a demographically diverse set of participants with cancer/tumors who are participating on a clinical trial.

Eligibility:

  • Adults and children (12 years of age or older) with a diagnosis of a cancer or tumor who are on a clinical study for their underlying cancer/tumor.
  • Participant must have access to an internet-connected smartphone or computer with a camera and microphone and must be willing to pay any charges from the service provider/carrier associated with use of the device.

Design:

  • The design is a single institution, observational, non-intervention clinical study at the National Institutes of Health Clinical Center.
  • All participants will complete the same activities in two different settings (remotely and in-clinic) over a three-month period.
  • At home, participants will utilize a mobile application for self-reporting of pain and will audio-visually record themselves reading a passage of text and describing how they feel. In the clinic, participants will perform the same activities with optimal lighting and videography, along with infrared video capture.
  • Visual (RGB) and infrared facial images, audio signal, self-reported pain, and natural language verbalizations of participant feelings will be captured. Audio signal and video data will be annotated with self-reported pain and clinical data to create a supervised machine learning model that will learn to automatically detect pain (a minimal illustrative sketch of this labeling step follows this list).
  • Care will be taken with the study sample to include a diversity of genders and skin types (a proxy for racial diversity) to establish a broad applicability of the model in the clinical setting. Additionally, video recordings of participant natural language to describe their pain and how they feel will be transcribed and auto-processed against the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) library to explore the presence and progression of self-reporting of adverse events.
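The annotation and training step referenced above can be sketched, purely for illustration, as pairing per-check-in facial and audio feature vectors with the participant's self-reported 0-10 pain score. The record structure, feature contents, use of the four pain strata as class labels, and the scikit-learn classifier below are assumptions made for this sketch, not the study's actual pipeline.

```python
# Minimal sketch (assumptions, not the study pipeline): pair facial/audio
# features from each check-in with the self-reported pain score so a
# supervised classifier can learn to predict a pain class.
from dataclasses import dataclass
from typing import List
import numpy as np
from sklearn.ensemble import RandomForestClassifier

@dataclass
class CheckIn:                      # hypothetical per-check-in record
    face_features: np.ndarray       # e.g., facial expression descriptors (assumed)
    audio_features: np.ndarray      # e.g., pitch/energy statistics (assumed)
    self_reported_pain: int         # 0-10 numeric rating from the questionnaire

def pain_class(score: int) -> str:
    """Bucket a 0-10 score into the study's four pain strata."""
    if score == 0:
        return "none"
    if score <= 3:
        return "mild"
    if score <= 6:
        return "moderate"
    return "severe"

def build_training_set(check_ins: List[CheckIn]):
    """Stack features and derive class labels from self-reported pain."""
    X = np.vstack([np.concatenate([c.face_features, c.audio_features])
                   for c in check_ins])
    y = np.array([pain_class(c.self_reported_pain) for c in check_ins])
    return X, y

def train(check_ins: List[CheckIn]) -> RandomForestClassifier:
    """Fit a simple supervised model on the annotated examples."""
    X, y = build_training_set(check_ins)
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X, y)
    return model
```

In this sketch the self-reported scores serve as training labels; the study itself may use different feature extractors, model families, or label definitions.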

Study Type : Observational
Estimated Enrollment : 120 participants
Observational Model: Cohort
Time Perspective: Prospective
Official Title: A Feasibility Study Investigating the Use of Machine Learning to Analyze Facial Imaging, Voice and Spoken Language for the Capture and Classification of Cancer/Tumor Pain
Actual Study Start Date : October 27, 2020
Estimated Primary Completion Date : January 31, 2022
Estimated Study Completion Date : January 31, 2022

Group/Cohort
1DF/NoPain_IV-VI_Female: Worst pain in past month = 0; Skin Type IV-VI, Female
1DM/NoPain_IV-VI_Male: Worst pain in past month = 0; Skin Type IV-VI, Male
1LF/NoPain_I-III_Female: Worst pain in past month = 0; Skin Type I-III, Female
1LM/NoPain_I-III_Male: Worst pain in past month = 0; Skin Type I-III, Male
2DF/MildPain_IV-VI_Female: Worst pain in past month = 1-3; Skin Type IV-VI, Female
2DM/MildPain_IV-VI_Male: Worst pain in past month = 1-3; Skin Type IV-VI, Male
2LF/MildPain_I-III_Female: Worst pain in past month = 1-3; Skin Type I-III, Female
2LM/MildPain_I-III_Male: Worst pain in past month = 1-3; Skin Type I-III, Male
3DF/ModPain_IV-VI_Female: Worst pain in past month = 4-6; Skin Type IV-VI, Female
3DM/ModPain_IV-VI_Male: Worst pain in past month = 4-6; Skin Type IV-VI, Male
3LF/ModPain_I-III_Female: Worst pain in past month = 4-6; Skin Type I-III, Female
3LM/ModPain_I-III_Male: Worst pain in past month = 4-6; Skin Type I-III, Male
4DF/SeverePain_IV-VI_Female: Worst pain in past month = 7-10; Skin Type IV-VI, Female
4DM/SeverePain_IV-VI_Male: Worst pain in past month = 7-10; Skin Type IV-VI, Male
4LF/SeverePain_I-III_Female: Worst pain in past month = 7-10; Skin Type I-III, Female
4LM/SeverePain_I-III_Male: Worst pain in past month = 7-10; Skin Type I-III, Male



Primary Outcome Measures :
  1. Feasibility of using facial recognition technology to classify pain [ Time Frame: 3 months ]
    The primary objective of this study is to determine the feasibility of using facial recognition technology to classify pain in a demographically diverse set of patients with cancer who are participating on a clinical trial.


Secondary Outcome Measures :
  1. To determine the feasibility of using voice recognition technology [ Time Frame: 3 months ]
    Voice recognition technology

  2. To transcribe patient video responses to assess pain using free-text [ Time Frame: 3 months ]
    Video responses to assess pain using free-text

  3. To determine the feasibility of combining RGB and thermal images with voice recognition transcribed verbal responses [ Time Frame: 3 months ]
    RGB and thermal images

  4. To use natural language processing algorithms to assess pain [ Time Frame: 3 months ]
    Natural language processing algorithms to assess pain
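Transcribed responses are to be auto-processed against the PRO-CTCAE library. A minimal sketch of what such processing could look like is shown below, assuming a small hypothetical list of PRO-CTCAE-style symptom terms; the library's actual format and the study's matching method are not specified here.

```python
# Minimal sketch (assumptions only): flag PRO-CTCAE-style symptom terms
# in a transcribed participant response. The term list and matching logic
# are illustrative, not the study's actual PRO-CTCAE processing.
import re
from typing import Dict, List

# Hypothetical subset of PRO-CTCAE symptom terms.
PRO_CTCAE_TERMS: List[str] = ["pain", "fatigue", "nausea", "numbness", "headache"]

def match_terms(transcript: str, terms: List[str] = PRO_CTCAE_TERMS) -> Dict[str, int]:
    """Count whole-word occurrences of each symptom term in the transcript."""
    text = transcript.lower()
    counts: Dict[str, int] = {}
    for term in terms:
        hits = re.findall(rf"\b{re.escape(term)}\b", text)
        if hits:
            counts[term] = len(hits)
    return counts

# Example: match_terms("The pain in my back is worse and I feel a lot of fatigue.")
# -> {"pain": 1, "fatigue": 1}
```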





Ages Eligible for Study:   12 Years and older   (Child, Adult, Older Adult)
Sexes Eligible for Study:   All
Accepts Healthy Volunteers:   No
Sampling Method:   Non-Probability Sample
Study Population
Patients with histologically or cytologically proven advanced malignancies who are undergoing treatment for cancer.
Criteria
INCLUSION CRITERIA:
  • Ability of subject to understand and willingness to sign a written informed consent document.
  • Male or female subjects (including NIH staff) aged greater than or equal to 12 years.
  • Participants with diagnosis of a cancer or tumor
  • Participant must be on a protocol for their cancer/tumor at NIH
  • Must have access to a smartphone (iPhone or Android) with a data plan and/or access to wireless internet (Wi-Fi), or a computer with a camera, microphone, and internet access, and must be willing to use their own device and assume any associated charges from service providers.

EXCLUSION CRITERIA:

  • Participants with brain or central nervous system (CNS) metastases. However, if a participant has completed curative intent radiotherapy or surgery and has remained asymptomatic for the prior three months, then he/she will be eligible to participate.
  • Participants with Parkinson's disease.
  • Known current alcohol or drug abuse.
  • Any psychiatric condition that would prohibit the understanding or rendering of informed consent.
  • Non-English speaking subjects.


Please refer to this study by its ClinicalTrials.gov identifier (NCT number): NCT04442425


Locations
United States, Maryland
National Institutes of Health Clinical Center
Bethesda, Maryland, United States, 20892
Sponsors and Collaborators
National Cancer Institute (NCI)
Investigators
Principal Investigator: James L. Gulley, M.D., National Cancer Institute (NCI)
Additional Information:
Responsible Party: National Cancer Institute (NCI)
ClinicalTrials.gov Identifier: NCT04442425    
Other Study ID Numbers: 200130
20-C-0130
First Posted: June 22, 2020
Last Update Posted: September 5, 2021
Last Verified: August 31, 2021

Studies a U.S. FDA-regulated Drug Product: No
Keywords provided by National Institutes of Health Clinical Center (CC) (National Cancer Institute (NCI)):
Telehealth
Self-Reported Pain
Facial Recognition Technology
Pain Score