Technology · 6 min read

Doctors, lawyers, and travel advisors are quietly getting offended when clients fact-check their expertise using AI chatbots | TechRadar




'It feels like a slap in the face' — never fact-check your doctor with ChatGPT, as a study shows doing so 'undermines' your relationship with human experts — and could even offend them by making them feel 'disrespected'

Consult AI, but keep it discreet or lose a vital professional relationship


  • Study finds professionals feel disrespected when clients compare their expertise with AI-generated answers
  • Advisors become less motivated after losing clients to AI-powered recommendations online
  • Clients who use AI fact checks may appear less trustworthy to professionals afterward

A new study from Monash Business School claims that professional advisors feel offended when clients use AI to get a second opinion on their recommendations.

The research, published in Computers in Human Behavior, found professionals become less motivated to work with clients who consult AI tools.

This effect persists even when the client only uses AI for background information, or as a complementary resource rather than a replacement.


“Advisors view AI as substantially inferior to themselves; thus, being placed in the same category as an AI system feels insulting and signals disrespect, undermining advisors' willingness to engage,” Associate Professor Gerri Spassova, the lead author, said.

Imagine spending an hour helping a client plan a complex trip, carefully mapping out flights, hotels, and itineraries — only for that client to take your recommendations and book everything through an AI chatbot instead.

Researchers found professionals who lost business to an AI were far less willing to work with that client again in the future.

Clients who consult AI may be seen as less competent and less warm by the advisors they approach for help.

When clients defer to AI, it prompts advisors to question the value of their own human contribution, and this may get worse as AI gets better.

Many advisors take offense at this, and it is the major reason why they pull back from clients who consult AI.

“One can only speculate,” Associate Professor Spassova said. “My intuition is that the situation will not get much better. Firstly, because professional advisors’ jobs are on the line.


“Also, as AI gets better, it may threaten our sense of worth and self-regard, and so when clients defer to AI, it would prompt advisors to question the value of their human contribution.”

The study suggests that in new client–advisor relationships, clients should not disclose that they consulted AI before the meeting.

A long history of working together might weaken the negative reaction, but even then, the advisor may still feel cheated.

This applies to doctors, lawyers, and other professionals whose expertise clients might fact-check with AI tools.

A doctor who spent years training does not want to be second-guessed by a patient who spent five minutes on ChatGPT.

AI tools usually give only a general overview of a situation and are prone to mistakes.

Their judgment depends heavily on how much information you supply; if you are not detailed enough, the response can be misleading.

AI also answers according to how a question is phrased, so users can easily steer a tool into telling them what they want to hear.

Considering these nuances, it would be unfair to judge a professional with years of study and experience based on an uncertain tool.

There is no need to throw it in a professional's face that you have consulted AI, because doing so signals a lack of trust.

Until professional norms adjust to the presence of AI, clients would be wise to keep their fact-checking private or risk damaging valuable professional relationships.


Efosa has been writing about technology for over seven years, initially driven by curiosity but now fueled by a strong passion for the field. He holds both a master's degree and a PhD in the sciences, which gave him a solid foundation in analytical thinking.



