Agentic AI-based services are the new Shadow IT.

Uncategorized · 18 posts · 15 posters
BrianKrebs wrote:
> Agentic AI-based services are the new Shadow IT. Change my mind.

Allan Chow (#9):
@briankrebs let's be honest tho, shadowit.ai sounds pretty badass.
BrianKrebs wrote:
> I'd argue that very few companies have any real appreciation for how many of their employees are already feeding API keys and other sensitive material into fairly new and questionable agentic AI tools and platforms. So many companies say they're taking a wait-and-see approach to adopting AI; meanwhile, half their dev team is doing critical development work on shared servers that have no authentication, or weak (no 2FA) auth.

Dan Sugalski (#10):
@briankrebs I'm also really curious how many people have aggressively violated various privacy laws by feeding material into various LLMs for "summary" and "analysis".

Frankly, it should be a much larger compliance nightmare than it is. (Or, I suppose, it *is* a ginormous compliance nightmare and right now everyone is just incorrectly assuming it isn't.)
Thomas H Jones (#11):
@briankrebs@infosec.exchange

On the plus side, step #1 of setting up things like an #AWS/#Azure/#GCP account (especially production ones) is to disable the ability to create IAM users, forcing the use of IAM roles that are 2FA-authenticated via a service like #Okta, and the role-based authentication tokens are typically TTL'd to a couple of hours.

Still, a "good" (scare quotes) agent setup would be pretty trivial to configure to snarf credentials from the relevant token services, and that triviality likely applies more broadly.
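The guardrail Thomas describes — preventing long-lived IAM users so everyone must use short-lived, 2FA-backed role credentials — is commonly enforced org-wide with an AWS Service Control Policy. A minimal sketch; the Sid and the exact action list here are illustrative assumptions, not something stated in the thread:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyLongLivedIamCredentials",
      "Effect": "Deny",
      "Action": [
        "iam:CreateUser",
        "iam:CreateAccessKey"
      ],
      "Resource": "*"
    }
  ]
}
```

Attached to production accounts, a policy like this blocks new IAM users and static access keys outright, so any credential an agent could snarf at least expires within its token TTL.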
Mike Sheward (#12):
@briankrebs In several pen tests I've done over the last 18 months, one of the most interesting trends has been the sudden increase in the number of people who have thrown API keys, and in some cases raw data, into accidentally public GitHub repos while attempting to glue AI to things to "see what it can do".

A few weeks ago I found a GitHub repo that a developer had trained on a dump of their own corporate emails; all of those emails were just sitting public on GitHub, and contained things like vendor SFTP creds. It's a free-for-all.
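The accidental-exposure pattern Mike describes is exactly what secret scanners look for. A minimal, hypothetical sketch in Python: the AWS access key ID shape ("AKIA" plus 16 uppercase alphanumerics) is a documented format, while the generic rule is a loose heuristic; real scanners such as gitleaks or trufflehog ship far larger rule sets and also walk git history.

```python
import re

# Hypothetical rule set for a quick sweep. Only the AWS access key ID
# shape is a documented format; the generic rule is a rough heuristic.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)api[_-]?key\s*[:=]\s*['\"][A-Za-z0-9/+]{20,}['\"]"
    ),
}

def scan_text(text):
    """Return (rule_name, matched_string) pairs found in text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits
```

Running something like this over a repo (and its history) before it ever goes public is cheap; recovering from a leaked vendor SFTP credential is not.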
AI6YR Ben (#13):
@SecureOwl @briankrebs Wheeee
AI6YR Ben (#14):
@SecureOwl @briankrebs I will confess to playing random songs on a coworker's Alexa after they checked their personal home Alexa key into a corporate git repository.
George E. (#15):
@wordshaper@weatherishappening.network @briankrebs@infosec.exchange It took Enron before we got Sarbanes-Oxley. Data privacy will have its Enron moment eventually.
Michael Moore (#16):
@wordshaper @briankrebs Unfortunately, I don't think the people doing this care, or ever will. Privacy laws tend to be a joke anyway, and there is very little incentive for most people or companies to change. I don't think most governments even want that to change; it's better for them, allows more data collection, etc.

I wish I didn't have such a negative and cynical outlook on it all.
kgndiue (#17):
@briankrebs Oh, we don't even have 2FA, because reasons. Have I mentioned we have a gigantic bloated mess of IT bureaucracy, but nobody cares that we don't have a secure image repo?

But somebody had the idea to write safe-dev guidelines, because paper is what keeps us safe, not patching vulns.
Steffie (#18):
@grumpasaurus@infosec.exchange @briankrebs@infosec.exchange This is definitely what we all need: autonomous AI running IaC deployments. I mean, what could go wrong??