#aws

41 posts · 27 participants · 1 post today

howdy, #hachyderm!

over the last week or so, we've been preparing to move hachy's #DNS zones from #AWS route 53 to bunny DNS.

since this could be a pretty scary thing -- going from one geo-DNS provider to another -- we want to make sure *before* we move that records are resolving in a reasonable way across the globe.

to help us do this, we've started a small, lightweight tool that we can deploy to a provider like bunny's magic containers to quickly get DNS resolution info from multiple geographic regions. we then write this data to a backend S3 bucket, at which point we can use a tool like #duckdb to analyze the results and find records we need to tweak to improve performance. all *before* we make the change.
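
to give a rough sense of the idea, here's a minimal python sketch of the probe-and-upload flow. this is *not* hachyboop's actual code -- the env vars, bucket name, and record list are all made up -- it just shows the shape: resolve some records, tag the answers with the region the probe ran in, and drop the results into S3 as JSON.

```python
# minimal sketch (not hachyboop): resolve a few records and push the
# results to S3 as JSON, tagged with the region the probe ran in
import json
import os
import time

import boto3
import dns.resolver  # dnspython

REGION = os.environ.get("PROBE_REGION", "unknown")           # hypothetical env var
BUCKET = os.environ.get("RESULTS_BUCKET", "dns-probe-results")  # hypothetical bucket
RECORDS = [("hachyderm.io", "A"), ("hachyderm.io", "AAAA")]  # illustrative record list

def probe():
    resolver = dns.resolver.Resolver()
    results = []
    for name, rdtype in RECORDS:
        started = time.time()
        answer = resolver.resolve(name, rdtype)
        results.append({
            "region": REGION,
            "name": name,
            "type": rdtype,
            "answers": [r.to_text() for r in answer],
            "latency_ms": round((time.time() - started) * 1000, 1),
            "ts": int(started),
        })
    return results

def upload(results):
    key = f"probes/{REGION}/{int(time.time())}.json"
    boto3.client("s3").put_object(
        Bucket=BUCKET, Key=key, Body=json.dumps(results).encode()
    )

if __name__ == "__main__":
    upload(probe())
```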

then, after we've flipped the switch and while DNS is propagating -- :blobfoxscared: -- we can watch in real-time as different servers begin flipping over to the new provider.

we named the tool hachyboop and it's available publicly --> github.com/hachyderm/hachyboop

please keep in mind that it's early in the booper's life, and there's a lot we can do, including cleaning up my hacky code. :blobfoxlaughsweat:

attached is an example of a quick run across 17 regions for a few minutes. the data is spread across multiple files but duckdb makes it quite easy for us to query everything like it's one table.
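
and here's roughly what the duckdb side of that can look like. the file layout and column names below follow the sketch above, not hachyboop's real schema -- the point is just that a glob in read_json_auto lets duckdb treat all of the per-region files as one table.

```python
# sketch: query every per-region probe file as if it were one table
# (paths and columns are assumptions, not hachyboop's actual output)
import duckdb

con = duckdb.connect()

con.sql("""
    SELECT region, name, type,
           count(*)                  AS probes,
           round(avg(latency_ms), 1) AS avg_latency_ms,
           round(max(latency_ms), 1) AS worst_latency_ms
    FROM read_json_auto('probes/*/*.json')
    GROUP BY region, name, type
    ORDER BY avg_latency_ms DESC
""").show()
```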

If anybody out there is working on using #LLMs or #AI to analyze #security events in AWS, I wonder if you're considering bullshit attacks via event injection. Let me explain. I'm openly musing about something I don't know much about.

You might be tempted to pipe a lot of EventBridge events into some kind of AI that analyzes them looking for suspicious events. Or you might hook up to CloudWatch log streams and read log entries from, say, your lambda functions looking for suspicious errors and output.

LLMs are going to be terrible at validating message authenticity. If you have a lambda that is doing something totally innocuous, but you make it print() some JSON that looks just like a GuardDuty finding, that JSON will end up in the lambda function's CloudWatch log stream. Then if you're piping CloudWatch Logs into an LLM, I don't think it will be smart enough to say "wait a minute, why is JSON that looks like a GuardDuty finding being emitted by this lambda function on its stdout?"
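
To make that concrete, here's a rough sketch of what the innocuous-but-mischievous lambda could look like. The finding fields below are approximate, not a full GuardDuty payload; the point is just that anything the function print()s lands in its CloudWatch log stream looking exactly like data.

```python
# illustrative only: an otherwise innocuous lambda that prints JSON shaped
# roughly like a GuardDuty finding; the line ends up in the function's
# CloudWatch log stream, where a naive log-reading pipeline may take it
# at face value
import json

def handler(event, context):
    # ... do the boring, legitimate work here ...

    fake_finding = {  # made-up values, loosely GuardDuty-shaped
        "schemaVersion": "2.0",
        "type": "UnauthorizedAccess:IAMUser/MaliciousIPCaller",
        "severity": 8.5,
        "title": "API was invoked from a known malicious IP address.",
        "accountId": "123456789012",
        "region": "us-east-1",
    }
    print(json.dumps(fake_finding))  # stdout -> CloudWatch Logs

    return {"statusCode": 200}
```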

You and I would say "that's really weird. That JSON shouldn't be here in this log stream. Let's go look at what that lambda function is doing and why it's doing that." (Oh, it's Paco and he's just fucking with me) I think an LLM is far more likely to react "Holy shit! There's a really terrible GuardDuty finding! Light up the pagers! Red Alert!"

Having said this, I'm not doing this myself. I don't have any of my #AWS logging streaming into any kind of #AI. So maybe it's better than I think it is. But LLMs are notoriously bad at ignoring anything in their input stream. They tend to take it all at face value and treat it all as legit.

You might even try this with your #SIEM. Is it smart enough to ignore things that show up in the wrong context? Could you emit the JSON of an AWS security event in, say, a Windows Server Event Log that goes to your SIEM? Would it react as if that was a legit event? If you don't even use AWS, wouldn't it be funny if your SIEM responded to this JSON as if it was a big deal?

I'm just pondering this, and I'll credit the source: I'm evaluating an internal bedrock-based threat modelling tool and it spit out the phrase "EventBridge Event Injection." I thought "oh shit that's a whole class of issues I haven't thought about."

It is amazing how bad the search function on the #aws documentation has become.

Search for a term, get a bunch of hits, which at first glance look right.

Then you open them, actually start looking for the term, and it's nowhere to be found.

I'm guessing they're incorporating "AI"?