{"id":18223,"date":"2025-08-31T18:09:57","date_gmt":"2025-08-31T15:09:57","guid":{"rendered":"https:\/\/lamdabroking.com\/?p=18223"},"modified":"2025-08-31T18:10:56","modified_gmt":"2025-08-31T15:10:56","slug":"professional-liability-in-the-age-of-ai-advice","status":"publish","type":"post","link":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/","title":{"rendered":"Professional Liability in the Age of AI Advice"},"content":{"rendered":"<p dir=\"ltr\">This is the new frontier of <strong>professional liability<\/strong>. It\u2019s as if malpractice, a concept we associate with doctors or lawyers, now has a machine learning twist. The big question: in a claim stemming from faulty AI advice, <em>who is held responsible \u2013 the developer of the AI, the company deploying it, or the AI itself?<\/em> And how will insurance cover these scenarios?<\/p>\n<h2 dir=\"ltr\">When AI Gives Bad Advice: A New Kind of Malpractice<\/h2>\n<p dir=\"ltr\">Consider a few scenarios:<\/p>\n<ul dir=\"ltr\">\n<li>An AI financial advisor (let\u2019s say integrated into a banking app) tells a customer to invest their retirement fund in a high-risk portfolio not suited for them. The advice was generated by an algorithm finding patterns, but it turns out to be wholly inappropriate and the customer loses money.<\/li>\n<li>A virtual medical assistant app misinterprets symptoms and assures a user they have nothing serious, when in fact they needed urgent care. The delay causes the patient\u2019s condition to worsen.<\/li>\n<li>A legal advice chatbot drafts a contract clause for a small business, but the clause has a loophole that the business owner didn\u2019t catch. Later, that loophole is exploited in a dispute, costing the business dearly.<\/li>\n<\/ul>\n<p dir=\"ltr\">In each case, <strong>AI provided a service that would traditionally be given by a trained professional<\/strong> \u2013 a financial advisor, a doctor, a lawyer. 
When those professionals err, they (and their employers) can be sued for malpractice or negligence. They also typically carry professional liability insurance or malpractice insurance to cover such claims.<\/p>\n<p dir=\"ltr\">Now, when an AI makes the error, clients may suffer the same harm. They likely won\u2019t shrug and say \u201cOh well, it was just AI.\u201d They will seek accountability. They\u2019ll potentially sue the company that offered the AI service, arguing negligence in deploying or supervising the AI. They might even attempt to sue the makers of the AI technology. This is where liability gets complicated.<\/p>\n<h2 dir=\"ltr\">Who Bears Responsibility? Developer vs. Deployer vs. the AI<\/h2>\n<p dir=\"ltr\">Let\u2019s break down the potential parties:<\/p>\n<ul dir=\"ltr\">\n<li><strong>The Developer<\/strong>: This could be the company or team that built the AI model or software. For instance, the maker of the medical app, or OpenAI (for a GPT model), or any upstream tech provider. The question is, did the developer have a duty to the end-user? Often, software providers protect themselves with licensing agreements that disclaim liability for how the AI is used or the accuracy of its outputs. Unless the developer made specific promises (e.g., \u201cour AI is 99% accurate and safe for medical use\u201d \u2013 which they usually do not in contracts), it\u2019s hard to pin direct liability on them. Additionally, most AI providers position themselves as toolmakers. They often require the deploying company to <strong>accept responsibility<\/strong> for final use.<\/li>\n<li><strong>The Deployer\/Service Provider<\/strong>: This is likely the primary liable party in most cases. If a bank uses an AI to advise customers, the bank is offering the service. To the customer, it doesn\u2019t matter if a human or AI whispered the advice \u2013 it came from the bank\u2019s app. 
The bank has a duty of care to its customer in providing financial advice suitable for them. If that duty is breached via faulty AI advice, the bank can be held negligent. Similarly, if a hospital uses an AI to analyze radiology images and it misses a tumor, the patient can sue the hospital or doctor for malpractice; the hospital can\u2019t just point at the AI and evade responsibility. In legal terms, the AI is a tool, and professionals are expected to use tools appropriately.<\/li>\n<li><strong>The AI itself<\/strong>: Currently, an AI has no legal personhood (we discuss potential future personhood separately). You cannot sue \u201cDr. Algorithm\u201d or \u201cCounselor GPT\u201d in court as a defendant. There\u2019s no mechanism to serve papers to an AI or make it pay damages. So, practically, the AI agent bears no liability \u2013 liability rests with those behind it. Perhaps one day, laws might allow some sui generis status for AI, but even then, any judgment would likely be paid out from an insurance policy or fund set up by, you guessed it, the humans who created or employed the AI. So the AI is, at best, an indirect cause, not a liable entity on its own.<\/li>\n<\/ul>\n<p dir=\"ltr\">Given that, the <strong>deployer<\/strong> (the company using AI in their service) is usually in the firing line. That company might then, if it loses money or faces a claim, try to recoup from the developer via indemnity clauses or lawsuits alleging a defective product. But those are upstream fights that may or may not succeed.<\/p>\n<h2 dir=\"ltr\">How Insurance is Adapting to AI-Driven Professional Services<\/h2>\n<p dir=\"ltr\">Now, how does insurance factor in? Let\u2019s consider a few types of coverage:<\/p>\n<ul dir=\"ltr\">\n<li><strong>Professional Liability Insurance (Errors &amp; Omissions)<\/strong>: Many businesses and professionals carry this to cover negligence in the services they provide. 
For example, law firms have malpractice insurance, financial advisors have E&amp;O insurance, etc. Traditionally, these policies assume a human professional is doing the work, possibly aided by software. Increasingly, insurers are clarifying that <strong>the use of AI does not void coverage<\/strong>. If a lawyer uses an AI tool to draft a brief and it inserts a terrible error, a well-crafted Lawyers Professional Liability policy should still cover the claim by the client (assuming no other exclusions). The key is whether using the AI was within the scope of providing professional services.<\/li>\n<\/ul>\n<p dir=\"ltr\">One potential wrinkle: If a firm hands off work entirely to an AI without oversight, could an insurer argue that isn\u2019t a \u201cprofessional service by a qualified professional\u201d and thus not covered? For instance, if a law firm let an AI give clients legal advice directly with no attorney review, an insurer might balk, claiming that the policy covers work performed by or under the supervision of a licensed attorney. This is a gray area. Insurers and insureds will likely negotiate terms \u2013 some policies might explicitly require human review for coverage, others might explicitly include autonomous AI advice as covered. We might see endorsements that say something like, \u201cCoverage is extended to claims arising from the use of artificial intelligence tools in rendering professional services, provided that the Insured has maintained oversight consistent with industry practices.\u201d<\/p>\n<ul dir=\"ltr\">\n<li><strong>Product Liability Insurance<\/strong>: If the deploying company argues \u201chey, the AI was a product we used and it was defective,\u201d they might look to the AI developer\u2019s product liability coverage. But most AI providers deliver software (often under license terms calling it not a \u201cproduct\u201d but a service, to further distance from product liability law). 
Product liability for software is still not well established in many jurisdictions. Unless the AI caused physical injury or property damage (which in advice cases, it usually doesn\u2019t \u2013 it causes pure financial loss or intangible harm), product liability coverage from the developer might not even apply. Also, many AI developers, especially big ones, will force users via contract to waive claims or limit their liability to trivial amounts.<\/li>\n<li><strong>Cyber Insurance<\/strong>: If the AI advice error stemmed from something like a glitch due to a cyberattack or a data issue, sometimes cyber insurance could come into play, but that\u2019s more tangential. Generally, an AI giving bad advice is not a cyber breach or system failure (it\u2019s performing as designed, just not with a desired outcome). Cyber policies probably won\u2019t cover pure \u201cbad advice\u201d scenarios.<\/li>\n<\/ul>\n<p dir=\"ltr\">Given that professional liability\/E&amp;O is the main line of defense, insurers are adjusting underwriting questionnaires and policy language:<\/p>\n<ul dir=\"ltr\">\n<li>Underwriters may ask, \u201cDo you use AI or automated tools in delivering your professional service? If so, in what capacity and what oversight is present?\u201d They want to gauge the risk. A firm that blindly relies on AI for critical decisions might be seen as higher risk than one that uses AI only for first drafts that humans always check.<\/li>\n<li>Some insurers worry about the \u201csilent AI exposure\u201d \u2013 meaning policies inadvertently covering AI-caused issues they didn\u2019t price for. For example, an insurer might not have thought that a $10 million policy for a law firm would be on the hook for an error made by a non-human actor. The scale of potential error could be larger if AI enables one professional to do far more work (hence more chances for error). 
Insurers might adjust premiums or require additional safeguards for heavy AI use.<\/li>\n<\/ul>\n<p dir=\"ltr\">On the flip side, there is an <strong>opportunity<\/strong>: offering <strong>AI malpractice insurance<\/strong> as a product. This could be marketed to:<\/p>\n<ul dir=\"ltr\">\n<li>Companies that create AI advisory systems (covering their liability if their AI causes clients harm, which could complement their product liability).<\/li>\n<li>Businesses deploying AI advisors (covering the unique aspects of AI errors, perhaps including things like the cost to fix an AI\u2019s mistake in addition to liability to third parties).<\/li>\n<\/ul>\n<p dir=\"ltr\">We haven\u2019t yet seen standalone \u201cAI malpractice\u201d policies widely advertised, but they could emerge as claims start happening.<\/p>\n<h2 dir=\"ltr\">Malpractice Meets Machine Learning: Real-world Precedents<\/h2>\n<p dir=\"ltr\">To date, fully autonomous AI advice giving is still in early phases, so we haven\u2019t seen a flood of litigation \u2013 but some harbingers:<\/p>\n<ul dir=\"ltr\">\n<li>In the legal field, there was the notorious case of a lawyer using ChatGPT to write a brief, which cited nonexistent cases. The lawyer faced court sanctions for it. That raised questions: if the client had been harmed, would the malpractice insurer cover a claim? Likely yes, but the lawyer clearly breached duty by not verifying AI output. It sets an example that <strong>AI is a tool, and professionals must validate its results.<\/strong> Failure to do so could be deemed negligence on the professional\u2019s part.<\/li>\n<li>In healthcare, if doctors start relying on AI diagnostic tools, malpractice law will likely treat the AI like a medical device or test result. The doctor is expected to use it wisely, not blindly. If an AI says \u201call clear\u201d but signs of illness were obvious, a doctor can\u2019t hide behind the AI \u2013 they\u2019d be liable for missing the diagnosis. 
Their med-mal insurance would cover it, and the hospital might then try to sue the AI vendor if the tool was clearly faulty.<\/li>\n<li>Robo-advisory platforms in finance typically operate with human oversight. If a pure robo-advisory platform (with minimal human oversight) had a big mishap, affected customers could have a class action. The provider\u2019s E&amp;O insurance should in theory cover the claims (unless they had some exclusion for automated trading losses, which would be unusual if that\u2019s their business model).<\/li>\n<\/ul>\n<p dir=\"ltr\">In sum, early signs indicate the <strong>law will treat AI like any other tool \u2013 responsibility remains with the professional or firm deploying it.<\/strong><\/p>\n<h2 dir=\"ltr\">Who Pays? Developer Indemnities and the Insurance Tower<\/h2>\n<p dir=\"ltr\">When a deploying company gets sued and pays out due to AI\u2019s bad advice, they might turn around and see if the developer had any indemnification obligations. Some AI vendors might offer indemnity for certain types of claims (for example, if the AI output infringes someone\u2019s copyright, a vendor might indemnify the business using it). But few if any will indemnify for \u201cAI gave wrong advice.\u201d They usually explicitly forbid use in certain high-risk scenarios in their terms or say \u201cnot responsible for any outcome; user assumes all risk.\u201d<\/p>\n<p dir=\"ltr\">Thus, the deploying company\u2019s insurance is the safety net. That company will have a <strong>tower of insurance<\/strong>: perhaps a primary E&amp;O policy and excess layers, maybe a cyber policy, and D&amp;O for shareholder suits. 
A big AI-related incident could trigger multiple layers:<\/p>\n<ul dir=\"ltr\">\n<li>The primary E&amp;O pays for the customer lawsuits.<\/li>\n<li>If customers are numerous, excess E&amp;O layers could kick in for larger total payouts or a class action settlement.<\/li>\n<li>If the company\u2019s stock price plunges due to a scandal from AI advice causing harm, shareholders might sue executives for mismanagement \u2013 triggering D&amp;O coverage separately.<\/li>\n<\/ul>\n<p dir=\"ltr\">Insurance companies are certainly gaming out these multi-layer scenarios as they develop new products and set premiums.<\/p>\n<h2 dir=\"ltr\">Risk Mitigation: Good Practices (often required by insurers)<\/h2>\n<p dir=\"ltr\">Insurance isn\u2019t the only piece; avoiding the loss in the first place is key. Here are practices insurers may require or strongly incentivize:<\/p>\n<ul dir=\"ltr\">\n<li><strong>Human in the Loop:<\/strong> For now, best practice is that AI doesn\u2019t get the final say on high-stakes advice. A human professional should review AI-generated outputs, especially in medicine, finance, law, engineering, and similar fields. Insurers may ask about this. If a firm says \u201cNo, we let the AI handle everything,\u201d that could raise a red flag or lead to higher premiums.<\/li>\n<li><strong>Disclosure and Consent:<\/strong> Some professions now require disclosure when AI is used (e.g., some bar associations say lawyers should inform clients if AI was used in their case prep). If clients are informed upfront that advice is automated or AI-assisted, it might help legally (the client was aware of the nature of the service). But a disclaimer \u201cthis is not professional advice, just AI\u201d may not fully protect a company if they, in effect, are in an advisory role. 
Still, having strong disclaimers in user agreements can reduce liability to an extent (though consumer protection laws may limit how much one can waive).<\/li>\n<li><strong>Quality Assurance and Training:<\/strong> Companies should rigorously test AI systems before deployment. For instance, a fintech company might test the AI advisor against historical scenarios to see how it performs, tweaking it to avoid known pitfalls. Regular audits of AI decisions can catch issues early. Insurers love to hear about risk controls like these \u2013 it shows the company isn\u2019t just naively trusting an algorithm.<\/li>\n<li><strong>Updates and Monitoring:<\/strong> AI models can drift or become outdated. Ensuring the AI\u2019s knowledge is up-to-date (for example, a legal AI must know about the latest laws; a medical AI needs current research) is important. If an AI missed something because it wasn\u2019t updated, that could be seen as negligence in maintenance. Having a process for continuous improvement and error correction is crucial.<\/li>\n<\/ul>\n<p dir=\"ltr\">From an insurance payout perspective, if a company can show, \u201cWe followed industry best practices with our AI, but an unforeseeable error still occurred,\u201d an insurer would have a much harder time denying a claim. Conversely, if the company was reckless (e.g., using a general-purpose AI without validation in a critical role), the insurer might reserve rights or at least make the case that the company failed to mitigate known risks.<\/p>\n<h2 dir=\"ltr\">The Road Ahead: Clearer Contracts and Policies<\/h2>\n<p dir=\"ltr\">As AI advice becomes more common, we can expect clearer frameworks:<\/p>\n<ul dir=\"ltr\">\n<li><strong>Contracts between deployers and AI providers<\/strong> will evolve. Perhaps AI providers will offer \u201cwarranties\u201d or insurance-backed guarantees for certain use cases to make customers feel safer. 
For example, an AI vendor might include in its contract: \u201cIf our AI recommendation engine produces an error that leads directly to a defined financial loss, we will reimburse up to $X or cooperate with your insurer.\u201d These kinds of promises are not standard yet, but market pressure could create them, especially if one vendor does it as a competitive edge.<\/li>\n<li><strong>Industry Standards and Certifications:<\/strong> Professions might develop standards for AI use. A medical association might approve certain AI tools as fit for use under guidelines. Using certified AI might be looked upon favorably by insurers (much as professionals are expected to use FDA-approved medical devices rather than unvetted tools).<\/li>\n<li><strong>Insurance Policy Evolution:<\/strong> Insurers might craft multi-faceted policies covering both the tech product and the professional use. For instance, a policy for a telehealth provider might cover malpractice whether the error comes from a doctor or the AI triage tool they use, in one package. This would prevent gaps and finger-pointing between different insurers of tech vs. professional service.<\/li>\n<\/ul>\n<p dir=\"ltr\">In conclusion, <strong>faulty AI advice is essentially the modern equivalent of professional error.<\/strong> Companies deploying these solutions should act as though they are responsible \u2013 because they are. Insurance will provide a backstop, but only if coverage is properly in place and the insured isn\u2019t grossly negligent in how they use AI. We\u2019re blending the realms of tech E&amp;O and professional liability, and the insurance industry is adapting with endorsements and new products to make sure that when machine learning meets malpractice, the victims can be made whole and the companies involved are protected from ruinous financial hits.<\/p>\n<p dir=\"ltr\">Who\u2019s responsible when GPT-Dr. Smith or GPT-Adviser Jones messes up? 
At the end of the day, the answer will almost always be: a human organization is. As one court put it succinctly, <strong>AI doesn\u2019t practice law or medicine \u2013 people do, using AI as a tool.<\/strong> Insurance and legal frameworks will reinforce that principle. As we navigate this, companies should ensure they have the right insurance coverage and risk controls, so they can innovate with AI in serving clients without inviting disaster. The promise of AI in professional fields is huge \u2013 increased efficiency, accessibility, and consistency. With careful oversight and robust insurance, we can enjoy those benefits knowing that if an AI makes a wrong call, the situation won\u2019t devolve into an uninsured, finger-pointing fiasco. Instead, there\u2019ll be a clear process: help the affected party, investigate what went wrong, and have the financial support (insurance) to handle the fallout.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat window \u2013 all powered by artificial intelligence. GPT-driven advisors and expert systems are increasingly capable of providing guidance that once only a human professional could. But what happens when that AI-powered advice is wrong, and someone gets hurt or suffers a loss as a result? 
<\/p>\n","protected":false},"author":9,"featured_media":18090,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[91],"tags":[],"class_list":["post-18223","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-insurance"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Professional Liability in the Age of AI Advice<\/title>\n<meta name=\"description\" content=\"We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat...\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Professional Liability in the Age of AI Advice\" \/>\n<meta property=\"og:description\" content=\"We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat...\" \/>\n<meta property=\"og:url\" content=\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\" \/>\n<meta property=\"og:site_name\" content=\"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/lamda.ins\" \/>\n<meta property=\"article:published_time\" content=\"2025-08-31T15:09:57+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-08-31T15:10:56+00:00\" \/>\n<meta property=\"og:image\" 
content=\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1600\" \/>\n\t<meta property=\"og:image:height\" content=\"900\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Oded Oded\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Oded Oded\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"15 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\"},\"author\":{\"name\":\"Oded Oded\",\"@id\":\"https:\/\/lamdabroking.com\/en\/#\/schema\/person\/a5b8f4894f9fd6a7a2f3742ba5688174\"},\"headline\":\"Professional Liability in the Age of AI Advice\",\"datePublished\":\"2025-08-31T15:09:57+00:00\",\"dateModified\":\"2025-08-31T15:10:56+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\"},\"wordCount\":2859,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/#organization\"},\"image\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg\",\"articleSection\":[\"AI 
Insurance\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\",\"url\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\",\"name\":\"Professional Liability in the Age of AI Advice\",\"isPartOf\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg\",\"datePublished\":\"2025-08-31T15:09:57+00:00\",\"dateModified\":\"2025-08-31T15:10:56+00:00\",\"description\":\"We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat...\",\"breadcrumb\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage\",\"url\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg\",\"contentUrl\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg\",\"width\":1600,\"height\":900,\"caption\":\"\u05d1\u05d9\u05d8\u05d5\u05d7 
AI\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/lamdabroking.com\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Cyber insurance\",\"item\":\"https:\/\/lamdabroking.com\/en\/cyber-insurance\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Professional Liability in the Age of AI Advice\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/lamdabroking.com\/en\/#website\",\"url\":\"https:\/\/lamdabroking.com\/en\/\",\"name\":\"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance\",\"description\":\"\u05e0\u05d9\u05d4\u05d5\u05dc \u05e1\u05d9\u05db\u05d5\u05e0\u05d9\u05dd \u05d5\u05e4\u05d9\u05e0\u05e0\u05e1\u05d9\u05dd\",\"publisher\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/lamdabroking.com\/en\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/lamdabroking.com\/en\/#organization\",\"name\":\"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance\",\"url\":\"https:\/\/lamdabroking.com\/en\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lamdabroking.com\/en\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2022\/12\/lamdaLogo-2.svg\",\"contentUrl\":\"https:\/\/lamdabroking.com\/wp-content\/uploads\/2022\/12\/lamdaLogo-2.svg\",\"width\":237,\"height\":102,\"caption\":\"\u05d1\u05d9\u05d8\u05d5\u05d7 
\u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance\"},\"image\":{\"@id\":\"https:\/\/lamdabroking.com\/en\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/lamda.ins\",\"https:\/\/www.linkedin.com\/company\/lamda-risk-and-capital-management\/\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/lamdabroking.com\/en\/#\/schema\/person\/a5b8f4894f9fd6a7a2f3742ba5688174\",\"name\":\"Oded Oded\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g\",\"caption\":\"Oded Oded\"},\"url\":\"https:\/\/lamdabroking.com\/en\/author\/oded\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Professional Liability in the Age of AI Advice","description":"We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat...","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/","og_locale":"en_US","og_type":"article","og_title":"Professional Liability in the Age of AI Advice","og_description":"We live in an age where you could receive financial planning from a bot, medical triage from an app, or legal information from a chat...","og_url":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/","og_site_name":"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance","article_publisher":"https:\/\/www.facebook.com\/lamda.ins","article_published_time":"2025-08-31T15:09:57+00:00","article_modified_time":"2025-08-31T15:10:56+00:00","og_image":[{"width":1600,"height":900,"url":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg","type":"image\/jpeg"}],"author":"Oded Oded","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Oded Oded","Est. 
reading time":"15 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#article","isPartOf":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/"},"author":{"name":"Oded Oded","@id":"https:\/\/lamdabroking.com\/en\/#\/schema\/person\/a5b8f4894f9fd6a7a2f3742ba5688174"},"headline":"Professional Liability in the Age of AI Advice","datePublished":"2025-08-31T15:09:57+00:00","dateModified":"2025-08-31T15:10:56+00:00","mainEntityOfPage":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/"},"wordCount":2859,"commentCount":0,"publisher":{"@id":"https:\/\/lamdabroking.com\/en\/#organization"},"image":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage"},"thumbnailUrl":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg","articleSection":["AI Insurance"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/","url":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/","name":"Professional Liability in the Age of AI Advice","isPartOf":{"@id":"https:\/\/lamdabroking.com\/en\/#website"},"primaryImageOfPage":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage"},"image":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage"},"thumbnailUrl":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg","datePublished":"2025-08-31T15:09:57+00:00","dateModified":"2025-08-31T15:10:56+00:00","description":"We live in an age where you could receive 
financial planning from a bot, medical triage from an app, or legal information from a chat...","breadcrumb":{"@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#primaryimage","url":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg","contentUrl":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2025\/08\/AI-Insurance16.jpg","width":1600,"height":900,"caption":"\u05d1\u05d9\u05d8\u05d5\u05d7 AI"},{"@type":"BreadcrumbList","@id":"https:\/\/lamdabroking.com\/en\/professional-liability-in-the-age-of-ai-advice\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/lamdabroking.com\/en\/"},{"@type":"ListItem","position":2,"name":"Cyber insurance","item":"https:\/\/lamdabroking.com\/en\/cyber-insurance\/"},{"@type":"ListItem","position":3,"name":"Professional Liability in the Age of AI Advice"}]},{"@type":"WebSite","@id":"https:\/\/lamdabroking.com\/en\/#website","url":"https:\/\/lamdabroking.com\/en\/","name":"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance","description":"\u05e0\u05d9\u05d4\u05d5\u05dc \u05e1\u05d9\u05db\u05d5\u05e0\u05d9\u05dd 
\u05d5\u05e4\u05d9\u05e0\u05e0\u05e1\u05d9\u05dd","publisher":{"@id":"https:\/\/lamdabroking.com\/en\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/lamdabroking.com\/en\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/lamdabroking.com\/en\/#organization","name":"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance","url":"https:\/\/lamdabroking.com\/en\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/lamdabroking.com\/en\/#\/schema\/logo\/image\/","url":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2022\/12\/lamdaLogo-2.svg","contentUrl":"https:\/\/lamdabroking.com\/wp-content\/uploads\/2022\/12\/lamdaLogo-2.svg","width":237,"height":102,"caption":"\u05d1\u05d9\u05d8\u05d5\u05d7 \u05d4\u05d9\u05d9\u05d8\u05e7 \u05d5\u05e1\u05d9\u05d9\u05d1\u05e8 - Lamda \u05dc\u05de\u05d3\u05d0 - High Tech Insurance"},"image":{"@id":"https:\/\/lamdabroking.com\/en\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/lamda.ins","https:\/\/www.linkedin.com\/company\/lamda-risk-and-capital-management\/"]},{"@type":"Person","@id":"https:\/\/lamdabroking.com\/en\/#\/schema\/person\/a5b8f4894f9fd6a7a2f3742ba5688174","name":"Oded Oded","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/b963c1df1f438ebca5af4999ce87b49df17e02ee8c0229a090b47e0993913bb1?s=96&d=mm&r=g","caption":"Oded 
Oded"},"url":"https:\/\/lamdabroking.com\/en\/author\/oded\/"}]}},"_links":{"self":[{"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/posts\/18223","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/users\/9"}],"replies":[{"embeddable":true,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/comments?post=18223"}],"version-history":[{"count":0,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/posts\/18223\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/media\/18090"}],"wp:attachment":[{"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/media?parent=18223"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/categories?post=18223"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lamdabroking.com\/en\/wp-json\/wp\/v2\/tags?post=18223"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}