ChatGPT does not replace health professionals: Order of Doctors president warns of the risks of AI use

"It is essential to stress that AI does not replace clinical judgment, medical experience or human contact. Diagnosis is an essential part of the medical act, which involves not only objective data but also contextual interpretation, active listening and empathy," Carlos Cortes told Lusa.

The president said AI should be viewed as a clinical decision support tool, not as an autonomous diagnostic agent.

Carlos Cortes insists that AI tools are not qualified to make diagnoses, citing the "lack of robust scientific evidence, rigorous certification mechanisms and, above all, algorithmic transparency that would allow doctors to understand and trust the decisions these systems propose."

The Artificial Intelligence Commission of the Order of Doctors (Ordem dos Médicos) pointed out that "the adoption of AI in medicine should only take place on the basis of scientific validation, with explainability, accountability and continuous medical supervision. Integrating AI into clinical workflows requires respect for ethical principles such as autonomy, non-maleficence and justice."

Regarding cases of wrong diagnoses, the president stated: "The consequences can be very serious: delays in starting adequate treatment, worsening of clinical conditions, exposure to unnecessary or inappropriate treatments, avoidable suffering and the loss of human life." He added that these tools assume no moral responsibility.

In addition, the lack of explainability in many algorithms makes it difficult to detect errors and to determine responsibility, and undermines patients' trust in doctors and in medicine.

The president also said that there are no known cases in Portugal where the use of AI has caused relevant clinical consequences, but cautioned that the absence of reported cases should not be confused with the absence of risk.

Since there is no body in the country dedicated to overseeing AI in health, the implementation of disruptive technologies "requires surveillance, auditing capacity, structured reporting systems and proactive regulation," the president concluded, in order to safeguard "clinical safety, patients' rights and the integrity of the medical act."

The president's statements came after the Spanish platform Daman warned about people asking AI models for diagnoses, in website posts and testimonials on social networks.

"I asked ChatGPT to read my palm and it detected skin cancer" or "a woman says ChatGPT diagnosed her cancer a year before doctors did" are examples of testimonials circulating on social networks.

However, the platform notes that the results of AI chatbots are probabilistic, that their training data may not be reliable, and that they cannot provide the follow-up that a doctor does.
