The FCC announced on Tuesday a plan to help consumers identify and block AI-generated robocalls. The plan, if passed, could affect a key part of real estate agents’ lead generation strategies.
The Federal Communications Commission has plans to tighten the reins on artificial intelligence-generated robocalls.
FCC Chairwoman Jessica Rosenworcel announced her plan on Tuesday, which would require callers to disclose their use of AI-generated robocalls when obtaining prior express consent from consumers. Even with prior express consent, callers would be required to make another disclosure on every AI-generated call they place, a measure Rosenworcel said would help consumers “identify and avoid” calls that “contain an enhanced risk of fraud and other scams.”
The plan also calls for developing technology that helps consumers identify and block unwanted AI-generated calls, and for protecting “positive uses” of AI-generated calls for consumers with disabilities.
Rosenworcel said her proposal builds on several recent actions the FCC has taken to regulate robocalls, including the passage of a declaratory ruling that said voice cloning technology is illegal and a $6 million fine levied against a New Hampshire man who made voice-cloned robocalls to sway 2024 primary voting.
The plan will go through a three-part voting process, beginning at the FCC’s August Open Meeting. If commissioners approve it, it will face public comment and a final vote before implementation.
Though the plan doesn’t mention any specific industry, it addresses a critical component of many real estate agents’ lead generation plans and the emerging tech that uses AI to automate cold calls.
Last year, Texas-based franchisor Keller Williams settled a $40 million class action lawsuit over unsolicited, pre-recorded telemarketing calls its agents made to consumers without their consent. The lawsuit leaned on the 1991 Telephone Consumer Protection Act (TCPA), which Rosenworcel cited several times in her announcement on Tuesday.
“Bad actors are already using AI technology in robocalls to mislead consumers and misinform the public,” she said in a written statement. “That’s why we want to put in place rules that empower consumers to avoid this junk and make informed decisions.”
In an email to Inman, marketing expert Katie Lance said Rosenworcel’s proposal is a “significant development” that agents and brokers shouldn’t ignore.
“For agents who rely on AI to streamline their marketing tasks, this move underscores the importance of ethical and compliant AI usage,” she said. “AI has revolutionized our industry by enabling more personalized and efficient communication with clients; however, it’s crucial for agents to understand the boundaries of these tools to ensure they are not infringing on consumer privacy or regulatory standards.”
Lance said AI must be used responsibly, and that now is the time for agents to assess which AI tools they’re using and adjust how they’re using them.
“For agents, this means being vigilant about the sources and methods of their AI tools, ensuring they comply with all relevant regulations, and focusing on building genuine connections with clients,” she said. “AI should augment our efforts, not replace the personal touch that is so vital in real estate.”