Fine-Tuning Generative Pre-Trained Transformers for Clinical Dialogue Summarization

Research output: Chapter in Book/Report/Conference proceedings › Chapter › peer-review

Abstract

Automated clinical dialogue summarization can make health professionals' workflows more efficient. With the advent of large language models, machine learning can provide accurate and efficient summarization tools, and Generative Pre-Trained Transformers (GPT) have shown great promise in this area. While larger GPT models, such as GPT-4, have been used, they bring their own problems in terms of precision and expense. Fine-tuning smaller models can yield more accurate results at lower computational cost. In this paper, we fine-tune a GPT-3.5 model to summarize clinical dialogue, using both default and manually selected hyperparameters for comparison. We also compare our default model to past work using ROUGE-1, ROUGE-2, ROUGE-L, and BERTScore, and find that our model outperforms GPT-4 across all measures. As our fine-tuning is based on the smaller GPT-3.5 model, this shows that fine-tuning leads to more accurate and less expensive results. Informal human review also finds our notes to be of acceptable quality.
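The abstract evaluates summaries with ROUGE metrics. As a hypothetical illustration of what ROUGE-1 measures (the paper presumably uses a standard implementation such as the `rouge-score` package, which also applies stemming and other normalization not shown here), a minimal unigram-overlap sketch:

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """Compute unigram-overlap ROUGE-1 precision, recall, and F1.

    Illustrative sketch only; not the evaluation code used in the paper.
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped overlap: each shared unigram counts at most min(cand, ref) times.
    overlap = sum((cand & ref).values())
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = (2 * precision * recall / (precision + recall)) if overlap else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

scores = rouge_1(
    "patient reports mild chest pain",      # hypothetical model summary
    "the patient reports chest pain",       # hypothetical reference note
)
print(round(scores["recall"], 2))  # → 0.8 (4 of 5 reference unigrams matched)
```

ROUGE-2 and ROUGE-L follow the same overlap idea over bigrams and longest common subsequences, while BERTScore replaces exact token matching with contextual-embedding similarity.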

Original language: English
Title of host publication: 2024 International Conference on Frontiers of Information Technology, FIT 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798331510503
Publication status: Published - 2024
Event: 2024 International Conference on Frontiers of Information Technology, FIT 2024 - Islamabad, Pakistan
Duration: 9 Dec 2024 - 10 Dec 2024

Publication series

Name: 2024 International Conference on Frontiers of Information Technology, FIT 2024

Conference

Conference: 2024 International Conference on Frontiers of Information Technology, FIT 2024
Country/Territory: Pakistan
City: Islamabad
Period: 9/12/24 - 10/12/24

Keywords

  • Data augmentation
  • Machine translation
  • Parameter tuning
  • Synthetic data generation
  • Transformers

