NEWS AND UPDATES
Experts tackle emerging ethical issues in health research
Experts discussed emerging issues concerning data sharing, artificial intelligence (AI), and bioethics during the 17th Philippine National Health Research System (PNHRS) Week last 16 August 2024, at the Almont Inland Resort and Hotel, Butuan City.
The talks during the third plenary session of the 17th PNHRS Week, entitled “Fostering Adaptive Ethical Dimensions of Health Research in Global Partnerships,” provided insights on addressing ethical issues brought about by modern technologies and the continued advancement of health innovations.
Prof. Peter Sy, Associate Professor at the University of the Philippines Diliman, highlighted the complexities of health data sharing. He described the current dilemma of balancing data sharing with its ethical considerations and the regulations that must be followed. According to Prof. Sy, there are key areas the research community should develop to improve data sharing. First, he cited the importance of a standardized system for data collection, which would reduce the burden of rigorous data cleaning and improve data quality.
Varying data sharing regulations and data protection laws across countries are also a significant challenge. To address this, Prof. Sy raised the need for collaborative models and partnerships that bridge current legal and ethical gaps. This can be achieved through a community-based participatory research approach, which ensures the involvement of the community, researchers, and other stakeholders in the research process.
The talk ended with a note on the importance of access to data. “Government money used for research, the default should be open: it should be made available to the public and to the research community,” said Prof. Sy. Citing a study from Procter & Gamble, he shared that open access to data enables innovation. At the same time, he acknowledged the ethical implications and the need to protect the rights of data subjects while advocating for open access to data.
Ethics in health research and AI was discussed by Prof. Michael Joseph Diño, Director of the Research Development and Innovation Center of Our Lady of Fatima University. According to Prof. Diño, the main challenges with the use of AI are accessibility, guiding policies and standards, and accuracy and security issues.
With these, Prof. Diño shared some approaches that can be applied to the use of AI in research. The first is to ensure human involvement alongside the use of AI. “Whenever we’re going to use AI with research, we need to balance technology with human judgment,” he said. He stressed that humans should remain accountable for research outputs produced with the use of AI.
To enable this, there is a need to develop policies and guidelines for AI in research. Prof. Diño proposed a framework that suggests reviewing prerequisites prior to the use of AI, identifying the purpose and expected outcomes of the research, knowing existing standards for the use of AI, and disclosing the use or non-use of AI.
Lastly, the session highlighted the need to discuss environmental health in bioethics. Dr. Pacifico Eric Calderon, Associate Professor at St. Luke’s College of Medicine and Medical Specialist II at the National Children’s Hospital, raised the question, “How can we expand bioethics to include both human health and environmental well-being?”
Dr. Calderon suggested that bioethics should encompass environmental issues to highlight its relevance in healthcare. He stated that environmental crises significantly affect human health, citing the effects of technology, toxins, and consumption on health. Given this, he emphasized that medical education should integrate bioethics. “We argue for the inclusion of environmental discussions in shaping our understanding of bioethics,” he said.
The plenary session closed with an open forum with the audience. To access the replay, visit the DOST-PCHRD Facebook page: https://www.facebook.com/dostpchrd/videos/1074092261019536.