In today's world, we're surrounded by an ever-increasing amount of information and advice, and it can be overwhelming to decide which advice to follow. As a result, people are turning to new sources of guidance, including artificial intelligence (AI) platforms like Chat GPT. These platforms use advanced algorithms to provide tailored advice and suggestions based on a wide range of data inputs. While AI advice has many benefits, it raises questions about the role of humans in decision-making and the reliability of machine-generated guidance. In this blog post, we'll explore the pros and cons of AI advice compared to human advice and why AI platforms like Chat GPT are becoming so popular right now.
Ok, I have a secret - I didn’t write that last paragraph. Instead, I gave Chat GPT a prompt and asked it to write the paragraph for me. Could you tell? Here’s what I typed into the prompt:
Write an introductory paragraph to a blog post about artificial intelligence advice vs human advice and explain why artificial intelligence platforms like chat GPT are so popular right now
Pretty impressive, isn’t it? If you haven’t tried it, I highly recommend you check it out. Chat GPT and other AI tools like it are improving daily and will soon be integrated into many software programs we use regularly.
As with any new technology, many fear the coming age of AI and worry that robots will soon replace humans in many jobs. This is a valid fear, as nobody knows just how quickly AI will advance and what capabilities it will have in the near future. However, there are also many ways that AI could be used to make us better at our jobs.
We are in the business of providing advice. We help clients make important financial decisions, and when clients have a question, they come to us. You might think an AI chatbot would be a perfect replacement for much of what we do, but there are a few reasons why I believe human advice is still far superior to what you can find from AI.
The answer is often “it depends”
Many years ago, I wrote a blog post titled “It Depends,” arguing that those two words are the correct answer to most financial questions. There usually isn’t a right or wrong answer, especially when the question depends on what might happen in the future! I thought it would be fun to see how Chat GPT would handle some of these questions. Here are some prompts and the first portion of each response from Chat GPT. Notice a theme?
My prompt: When should I take social security?
Chat GPT Answer: The decision of when to take Social Security benefits depends on your individual financial situation and goals. There are several factors you should consider when making this decision, such as your health, your expected lifespan, your other sources of retirement income, your financial needs, and your tax situation…
My prompt: How much do I need to retire?
Chat GPT Answer: The amount of money you need to retire depends on several factors, including your current age, the lifestyle you want to maintain during retirement, and the expected cost of living in your retirement location…
My prompt: What kind of asset allocation should I have?
Chat GPT Answer: The ideal asset allocation for you will depend on several factors, including your financial goals, investment horizon, risk tolerance, and current financial situation…
By now, I think you get the point. Chat GPT agrees that the answer to most financial questions is “it depends.” However, that’s a much better alternative than some of the other answers it provides, which leads to my next concern.
Everyone is wrong sometimes, even computers
It’s always better to admit when you don’t know the answer than to pretend you do and provide the wrong one. Clients will often ask questions I can’t definitively answer on the spot, and I don’t hesitate to tell them that I’ll have to research it and get back to them. AI generally doesn’t do this; it tries to come up with an answer even if it doesn’t fully understand the question.
There are countless examples of people asking AI questions and getting back a detailed answer that is just plain wrong. Granted, AI will continue to improve, and in the near future, it might be incredibly difficult to stump Chat GPT. But will it ever admit that it doesn’t know an answer or that it might be wrong? This air of infallibility feels like a major flaw if you expect an AI to provide advice the way humans can.
We don’t know what’s behind the black box
We know that AI isn’t always right and that it won’t admit when it might be wrong, so how do we know when it is right? Chat GPT doesn’t provide sources for its answers; it’s essentially a black box. A Google search might take longer to get you to an answer, but at least you can find multiple sources, and if there’s more than one answer, you can compare them.
When I was starting out in my career at Grant Thornton, if I had to find the answer to a complicated tax question, I would have to dig into the tax code and IRS regulations. Blog posts wouldn’t cut it as a definitive source for a client. I remember a partner asking me for the code section where I found the answer. Luckily, I had done my research and provided the code section, but then he responded with, “Well, did you read the next section and the exceptions to that rule?” Not only did I need to find the answer and cite my source, but I had to dig deeper to make sure there weren’t other sections that might impact the answer.
Will we ever get to a point where we blindly accept that the answers provided by AI are correct? I think it’s possible, but I don’t think we are there yet. There are AI tools that provide citations and support, and for now, I think that’s a better approach.
Trust is built on relationships
We don’t provide financial advice to anyone until we get to know them and build a relationship with them. We generally go through a full financial planning engagement with all new clients so that we can learn as much as we can about them. It’s impossible to provide sound financial advice without understanding someone’s financial situation, personality, and background.
To provide valuable advice that will be acted upon, there also needs to be a certain level of trust in the relationship. We are always working to build that trust with our clients by following through on what we say we’ll do and providing high-quality service. I think it’s going to be difficult for many people ever to build that type of relationship with a machine.
Artificial intelligence is incredible and only getting better. It’s fun to play with, and everyone should learn to use these tools. There are many limitations right now, but if they are used correctly, they will be able to enhance our work and our lives in ways we can’t even imagine yet.
I will cheat again here and let Chat GPT finish this article for me…
While artificial intelligence like ChatGPT has made tremendous progress in the field of language processing and can generate insightful responses based on vast amounts of data, humans are still better at giving advice for several reasons. First, humans can offer personalized advice based on their own experiences and emotions, which AI lacks. Second, humans can interpret nonverbal cues, which are often crucial to understanding the situation and providing meaningful advice. Third, humans can empathize with others and provide emotional support, which is something that AI cannot replicate. Finally, humans can exercise judgment and make ethical decisions, which is a crucial aspect of giving good advice. Therefore, while AI can provide useful insights and recommendations, the unique qualities of human advice-giving make it still irreplaceable by artificial intelligence.
-Chris Benson, CPA, PFS
The views expressed represent the opinions of L.K. Benson & Company and are subject to change. These views are not intended as a forecast, a guarantee of future results, investment recommendation, or an offer to buy or sell any securities. The information provided is of a general nature and should not be construed as investment advice or to provide any investment, tax, financial or legal advice or service to any person. Please see Additional Disclosures for more information.