Several big businesses have published source code that incorporates a software package previously hallucinated by generative AI.

Not only that, but someone who spotted this recurring hallucination turned the made-up dependency into a real package, which developers then downloaded and installed thousands of times as a result of the AI’s bad advice, we’ve learned. Had the package been laced with actual malware, rather than being a benign test, the results could have been disastrous.

    • residentmarchant@lemmy.world · 18 points · 11 months ago

      I find if I write one or two tests on my own then tell Copilot to complete the rest of them it’s like 90% correct.

      Still not great but at least it saves me typing a bunch of otherwise boilerplate unit tests.
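      The pattern described above — writing a couple of tests by hand so a completion tool can extend them in the same shape — can be sketched roughly as follows. The `add()` function and the test names are purely hypothetical, for illustration only:

      ```python
      # Hypothetical function under test, used purely for illustration.
      def add(a, b):
          return a + b

      # Hand-written seed tests that establish the naming and assertion pattern:
      def test_add_positive():
          assert add(2, 3) == 5

      def test_add_negative():
          assert add(-2, -3) == -5

      # Boilerplate cases in the same shape — the kind a completion tool could
      # be prompted to fill in, and which still need human review:
      def test_add_zero():
          assert add(0, 0) == 0

      def test_add_mixed_signs():
          assert add(-2, 3) == 1

      # Run the tests directly (a test runner such as pytest would normally
      # collect the test_* functions instead):
      for t in (test_add_positive, test_add_negative,
                test_add_zero, test_add_mixed_signs):
          t()
      ```

      The point of the seed tests is to pin down the convention; the generated remainder is cheap to produce but, per the comment, only about 90% correct, so each case still warrants a review pass.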

    • anlumo@lemmy.world · 4 up, 1 down · 11 months ago

      It’s a matter of learning how to prompt it properly. It’s not a human, so it needs a different kind of instruction.