
RSA Day 3

(Posting this a day late as I was crazy exhausted yesterday after walking nearly ten miles! I literally lay down in the room at 22:30 and woke up at 04:30 still in my clothes, lights on, etc.... I think I was effectively conferenced out, and that was only Day 3!)

 

Great tracks today and some exciting notes. Plus I got to hit the Expo floor. Here are the talks I made it to:

 Teaching Software Engineers to Threat Model: We Did It, and So Can You - Jamie Dicken, New Relic 

Another Digital ID: Privacy-preserving Humanitarian Aid Distribution - Wouter Lueks, Faculty, CISPA Helmholtz Center for Information Security

Web Application Hacking 101 - Look Mom No Tools - Joseph M. (I'm not going to name him as I have poor thoughts to share below.)


Let's break down the classes. I thought there was some great info today.

Teaching Software Engineers to Threat Model: We Did It, and So Can You

Jamie did a great job of showing how a team of thoughtful and intentional engineers who are willing to partner with their stakeholders and audience can really push far left and deputize developer users to be security champions. 

Her team was facing the same challenge many of us face: lots of security reviews, few security reviewers, and an increasing backlog of potential risk. So they tried what many of us have probably tried: split the security review into chunks and move some pieces earlier in the process. But, like any body of work that we split, we now have twice the number of pieces, the same volume of work, and the same capacity.

So, if the volume of work is going to stay the same, we have to find a way to either cut down on the volume or increase the size of our funnel. I don't know about you, but headcount isn't in abundance these days, so a bigger funnel wasn't going to work. Unless... what if we could add people to the funnel? People who were experts in their domain and product, and who could be taught security and threat modeling?

Enter the Citizen Security Engineer, or Developers that we put in funny hats. 

By teaching developers to build their own threat models, we could not only limit the number of threat models our security engineers have to build, but we could also have the fix agents find their own problems! This was a win-win, but how to get there?


Here's the model as Jamie provided it.
1) Interview your Software Developers and get feedback about the current process and their concerns around the idea, e.g., "What happens if we do a bad job at the threat model?"

2) Interview your Security Engineers and get feedback about the current process and their concerns around putting the burden on the Software Engineers, e.g., "What happens if the Software Developers do a poor job?"

3) Begin developing the training.

    3a) Which Format works best? Computer Based Training (CBT)? Live Learning? Left-Seat-Right-Seat? 

    3b) Determine methodology. Do we use STRIDE? DREAD? PASTA? Jamie's team decided to utilize the STRIDE model.

[Figure: example of a STRIDE threat model]

But what if someone is already familiar with a different model and wants to use that? That was OK: New Relic follows a Golden Path model and lets that person threat model with their own templates and diagrams as long as the final purpose is served.
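
To make the STRIDE step concrete, here's a rough sketch of the kind of prompt list a developer might walk a single component through. The component name and the questions are my own illustration, not material from Jamie's talk or New Relic's actual training.

```python
# Rough sketch: walking one hypothetical component ("payments-api") through STRIDE.
# The component and prompts are invented for illustration only.

STRIDE = {
    "Spoofing":               "Can someone pretend to be a legitimate caller?",
    "Tampering":              "Can data be modified in transit or at rest?",
    "Repudiation":            "Can an actor deny performing an action?",
    "Information disclosure": "Can data leak to someone who shouldn't see it?",
    "Denial of service":      "Can the component be made unavailable?",
    "Elevation of privilege": "Can a caller gain rights they shouldn't have?",
}

def threat_model(component: str) -> None:
    """Print the STRIDE prompts a developer would answer for one component."""
    for category, question in STRIDE.items():
        print(f"[{component}] {category}: {question}")

if __name__ == "__main__":
    threat_model("payments-api")
```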

3c) What tools to use? The team decided that paid tools and more complexity weren't the way to go. So LucidChart, Visio, whatever they were already using: let them keep using it. Low tech for now.

3d) When to do it? Do we do it based on feature, or user story? I can tell you from my experience in infrastructure, as opposed to dev, that we do it when two systems want to talk to each other or when a new system is implemented, e.g., there's a lot less risk involved in giving Bob access to Alice's already implemented service than there is if Bob is going to give Alice's service programmatic/API-driven access to his service.

4) Map the Software Development/Procurement/Architectural process (whatever process corresponds to your implementation or release) and decide where you put the "security gate" of a threat model. They decided to put theirs in the change design document because of the type of development they did. I've placed mine in the change and procurement procedures.

5) Define the new workflow. Pretty simple: decide how the security gate is going to work. Where do they submit it, who reviews it, what feedback do they get, how do they get help?

6) Define the template. What info has to be there? If they don't follow the golden path, what's the minimum viable product that represents a thorough threat model?
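
For illustration, here's one guess at what a "minimum viable" threat model submission might capture. The fields are my assumptions, not New Relic's actual template.

```python
from dataclasses import dataclass, field

# One guess at a "minimum viable" threat model record; every field here is an
# assumption on my part, not New Relic's real template.
@dataclass
class ThreatModelSubmission:
    system_name: str
    diagram_url: str                  # LucidChart/Visio link, whatever the team already uses
    data_classification: str          # e.g. "public", "internal", "restricted"
    trust_boundaries: list[str] = field(default_factory=list)
    threats: list[str] = field(default_factory=list)       # STRIDE-categorized findings
    mitigations: list[str] = field(default_factory=list)
    open_questions: list[str] = field(default_factory=list)
    security_reviewer: str = ""       # filled in at the security gate
```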

7) Pilot with a few teams. Pick out your best and worst candidates for this process: find the super-savvy, almost-security-engineer type and the new grads. Have them run through the process. Where does it break? Give them the training. Do a threat model. Give feedback. Fully support the pilot users asynchronously at all times. Time-limit the pilot.

7a) Collect the feedback and adjust the pilot. Rerun if necessary. Some common points of feedback: training format/content, support, process/workflow, and an overall Net Promoter Score (not a true NPS, but some kind of rating system).

8) Get to it. New Relic assigned training to the target audience, gave a voluntary period of compliance (to break in teams and adjust processes), then set a mandatory date.


What happened as a result? They saw developers and implementers start to identify and modify risky designs before security teams even got to them. This has been reflected in my own experience: implementers more clearly understand our security requirements, and instead of building mitigations that modify the risk, they remove the risky behavior or design decision altogether.

Remember, your process won't be perfect, but it will be better than NO threat models.


Another Digital ID: Privacy-preserving Humanitarian Aid Distribution

Wouter Lueks had one of the most underrated talks at RSA and probably one of the most prescient, considering Russia/Ukraine, Israel/Hamas, and potentially China/Taiwan and North Korea before long. Wouter is a post-doctoral researcher at the CISPA Helmholtz Center for Information Security in Saarbrücken, Germany, and has developed a system for tracking humanitarian aid registration and distribution while protecting vulnerable populations from the collection of sensitive biometric data.

The more salient points are that the system utilizes smart cards to record a biometric which stays on-device and is matched to a cryptographic key associated with that household or other unit. This key can be copied to every household member's card, so that any member can collect aid at a time after distribution. More importantly, the card can also record the entitlements the bearer is due (e.g., bags of rice, amount of baby formula) AND record in the distribution system whether a key has ever been presented, preventing double-dipping.
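
To give a feel for the bookkeeping (not the cryptography; the real system uses privacy-preserving primitives and on-card biometric matching), here's a toy sketch of the household-key and double-dipping idea as I understood it from the talk. All names and structures are my own invention, not Wouter's actual construction.

```python
import secrets

# Toy sketch only: one key per household, entitlements carried on the card,
# and a distribution-side record of presented keys to refuse double-dipping.

class HouseholdCard:
    def __init__(self, household_key: bytes, entitlements: dict[str, int]):
        self.household_key = household_key        # copied to every member's card
        self.entitlements = dict(entitlements)    # e.g. {"rice_bags": 2, "formula_tins": 1}

class DistributionPoint:
    def __init__(self) -> None:
        self.redeemed_keys: set[bytes] = set()    # keys already presented this round

    def distribute(self, card: HouseholdCard) -> dict[str, int] | None:
        if card.household_key in self.redeemed_keys:
            return None                           # household already collected this round
        self.redeemed_keys.add(card.household_key)
        return card.entitlements

# Two members of the same household carry cards holding the same key.
key = secrets.token_bytes(16)
alice_card = HouseholdCard(key, {"rice_bags": 2, "formula_tins": 1})
bob_card = HouseholdCard(key, {"rice_bags": 2, "formula_tins": 1})

point = DistributionPoint()
print(point.distribute(alice_card))   # {'rice_bags': 2, 'formula_tins': 1}
print(point.distribute(bob_card))     # None -- double-dipping is refused
```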

I would highly recommend you read the paper by the same name or seek out Wouter's recorded presentation. He was very practical about the limitations of his technology, which was appreciated, and he was a great interactive speaker who ensured that the audience understood the difficulties, realities, and benefits of the system he was proposing without drowning us in cryptographic proofs.

We even got to have a short conversation about using such a system for voter registration and anonymity in dangerous areas (think of the Afghan elections, where people's hands were dyed purple, which resulted in their targeting and death) and about systems where a detrimental situation doesn't occur if the technology breaks (like an established welfare/commodities distribution platform).

I'm very excited to see where Wouter and his team take this work.

Web Application Hacking 101 - Look Mom No Tools

This was the only learning lab I had the opportunity to attend this week and one of the only genuinely disappointing sessions. While I was frustrated with the configuration and execution of the class, I was introduced to bWAPP: an extremely buggy web app which allows users to safely attempt a variety of web-based attacks without any special tooling.

My main criticism of this class was the multitude of spelling, grammatical, and logical errors in the documentation, which made it extremely difficult to follow, along with a failure to crawl, walk, run. The presenter had the opportunity to introduce HTTP, explain the HTTP methods, and then explain how one abuses them. Instead, I received the lab guide and was essentially told to read through it and teach myself. So after around an hour, many other members of the lab and I exited the room.
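
For what it's worth, the "crawl" step could have been as simple as probing which HTTP methods an endpoint accepts before trying to abuse any of them. This sketch assumes a local bWAPP install at a made-up URL; the target path is my guess, so adjust it to wherever your own lab instance lives.

```python
import requests

# Probe a bWAPP endpoint with each common HTTP method and report the response
# status plus any Allow header. The host and path are assumptions for a local lab.
TARGET = "http://localhost/bWAPP/portal.php"

for method in ("GET", "POST", "PUT", "DELETE", "OPTIONS", "HEAD"):
    resp = requests.request(method, TARGET, timeout=5)
    allow = resp.headers.get("Allow", "-")
    print(f"{method:7s} -> {resp.status_code} (Allow: {allow})")
```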

Feeling disappointed, I set out to see the conference floor and was not disappointed. I got to stop in at a number of vendor booths and meet some of my favorites, check out really cool technologies like Tailscale and Thinkst Canary, and get interviewed by Panther.



