Static Lists Are The Wrong Way to do Attack Surface Mapping

March 8, 2021

Post by Robert Hansen

This post is the first of a short series of posts that we’ve dubbed “Attack Surface Mapping The Wrong Way,” showing the wrong way that people/companies/vendors attempt to do attack surface mapping.  We begin with static lists and why they are the wrong way.

Static lists are flawed

When asked what format their asset inventory or attack surface map is stored in, the most common answer companies give is "Excel." It would probably be funny if it were not for the fact that Excel was never designed to be used that way and is, for all intents and purposes, static. Static lists are a wonderful place to get started, or a useful snapshot of the actual attack surface to perform actions against. But as an ongoing tool, they are insanely unwise. They are the modern equivalent of chiseling information into stone – simply the wrong way to handle dynamic data.

There are a lot of legacy reasons that an inventory may have grown in an Excel spreadsheet. Spreadsheets are easily shared, easy to sort, and relatively straightforward to search through. That said, they are entirely unchanging unless someone does something to change them. That makes them entirely manual, and if someone is not constantly updating them with new information, they will quickly become a fading memory of how things used to be, and increasingly less relevant to your existing environment.

The Internet Changes

The first reason static lists are flawed may seem obvious, but it is difficult to quantify: The Internet changes. Without knowing for sure how fast it changes or in what way, all we can say is that environments can and do change over time. Assets that a company once had in its inventory and cared about will suddenly be looked at through a new lens, and the organization will say, "these assets no longer belong to us," or "we no longer care about these assets." As we might expect, there are a variety of reasons for this, which we will explore.

Briefly, let us first appreciate that the attack surface map for a company is a living, breathing, and constantly evolving thing. The bigger the attack surface map, the more likely assets are to change — day to day and hour to hour. Hosts may be added or removed. Listening ports/services may be opened and closed. Software may be added, removed, and updated. Clearly, there is no way to keep the attack surface map up to date other than through fierce automation. Then, of course, if any changes are unknown or uncontrolled, we can safely refer to them as shadow assets, shadow services, and shadow software, respectively.
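To make the automation point concrete, here is a minimal sketch of what "fierce automation" looks like at its core: diffing two inventory snapshots to surface shadow assets and shadow services. The data model (hostnames mapped to sets of observed ports) and the hostnames are hypothetical, not from any particular product.

```python
# Hypothetical snapshot format: each scan maps a hostname to the set of
# listening ports observed. Diffing consecutive scans surfaces changes.

def diff_snapshots(previous, current):
    """Return hosts and services that appeared or disappeared between scans."""
    added_hosts = current.keys() - previous.keys()      # potential shadow assets
    removed_hosts = previous.keys() - current.keys()    # candidates for removal
    changed_services = {}
    for host in current.keys() & previous.keys():
        opened = current[host] - previous[host]         # potential shadow services
        closed = previous[host] - current[host]
        if opened or closed:
            changed_services[host] = {"opened": opened, "closed": closed}
    return {
        "added_hosts": added_hosts,
        "removed_hosts": removed_hosts,
        "changed_services": changed_services,
    }

yesterday = {"www.example.com": {80, 443}, "old.example.com": {443}}
today = {"www.example.com": {80, 443, 8080}, "staging.example.com": {22, 443}}

report = diff_snapshots(yesterday, today)
# staging.example.com shows up as an added host; port 8080 on www as an
# opened service — both are shadow changes if no one authorized them.
```

Run continuously, a loop like this turns discovery from a one-off snapshot into an ongoing process; anything in the "added" or "opened" buckets that no one can account for is, by definition, shadow.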

So, let us touch on some of the many reasons why an asset needs to be removed from an attack surface map:

  • Domain name(s) purposely or mistakenly expired
  • Hostname / asset was decommissioned or IP-filtered by a perimeter firewall
  • Listening port / service was disabled or IP-filtered by a perimeter firewall
  • Ownership of the asset was transferred to another company (i.e., M&A)
  • Management and responsibility of the asset was transferred to a third-party (i.e., vendors / cloud provider)
  • Asset is no longer important enough to pay attention to (i.e., campaign is over)
  • Someone registered the wrong domain name (i.e., keyword / typo)
  • Etc… etc…

As we can see, attack surface management must be more than just adding assets to an inventory, but smartly removing them as well. Otherwise, there will eventually be a lot of garbage data to contend with, and the map will be anything but useful and up to date.

The reasons they may be added versus removed are equally varied. Here are some examples:

  • The company may acquire another company
  • The company may acquire new software
  • The business may update software, or it may auto-update itself
  • The software may get hacked
  • Companies may get into a new line of business
  • Dev/QA teams may spin up new websites for testing or Network engineers may add new hardware for testing
  • Companies may work with 3rd parties who develop new assets for them
  • The marketing team may spin up promotional sites
  • Etc… etc…
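Smart removal, like addition, can be automated rather than done by hand in a spreadsheet. As a sketch (with an assumed record format and an assumed 30-day staleness policy, both hypothetical), assets that have not been seen in recent scans can be flagged for removal review instead of silently rotting in the list:

```python
# Hypothetical inventory records with a last_seen date per asset. Assets
# unseen for longer than the (assumed) policy window are flagged for
# removal review rather than deleted outright.
from datetime import date, timedelta

STALE_AFTER = timedelta(days=30)  # assumed policy; tune per organization

def flag_stale(inventory, today):
    """Split the inventory into live assets and removal candidates."""
    live, stale = [], []
    for asset in inventory:
        if today - asset["last_seen"] > STALE_AFTER:
            stale.append(asset["hostname"])
        else:
            live.append(asset["hostname"])
    return live, stale

inventory = [
    {"hostname": "www.example.com", "last_seen": date(2021, 3, 1)},
    {"hostname": "promo.example.com", "last_seen": date(2020, 12, 20)},  # campaign over
]
live, stale = flag_stale(inventory, date(2021, 3, 8))
```

Flagging rather than deleting matters: a host that disappears because of an expired domain or a firewall change may still be owned, so a human decision (informed by the reasons listed above) closes the loop.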

As you can see, it is not wise to rely on static lists for any longer than you absolutely must. The environment of an average company changes at a rate of roughly 1-5% year over year by our rough measure. That may not seem like much, but for an enterprise with 20,000 assets, that is up to 1,000 assets changing every year. That is an enormous amount of change to manage by hand, and as we have seen several times now, it often takes just one of those machines having an exploitable issue for an attacker to leverage.

That is why doing “discovery” has only minor utility to mature organizations, yet is often conflated with asset management. Think of discovery as a process that an asset management or attack surface management platform effectively operationalizes in a way that makes it continuous, as opposed to a singular snapshot in time. While discovery tools do have a small place in the penetration testing world, they provide next to no value when compared to a mature organization’s use of an up-to-date asset map.

0-Days do not wait for you

The next issue with static lists is that new exploits (also known as 0-days or zero-days) do not wait for you to update your inventory list. If a new exploit is introduced to the world and attackers start leveraging it quickly, that is not the day you want to realize your static list is weeks or months out of date. Manually updating a static list takes an enormous amount of time and simply does not scale when you are trying to combat a real threat that may be on your doorstep in days, or even hours.

It seems as if the future of successful attackers will be built on speed. And speed is the one thing you cannot have if you are operating on lists that have aged and must be manually re-vetted. This is one reason why Bit Discovery does not only present findings but also gives users direct access to the underlying data. So even if a 0-day does not yet have an official CVE, if you know that some product is vulnerable, you can quickly identify that product/service in your inventory and take corrective action without waiting for a CVE to be assigned.
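The pre-CVE workflow described above amounts to querying the raw inventory data by product and version the moment news of a 0-day breaks. A minimal sketch, with an entirely hypothetical inventory format and product names (not the actual Bit Discovery data model):

```python
# Hypothetical inventory format: each asset lists its services with the
# product/version fingerprinted on each port. When a 0-day lands before
# any CVE exists, query by product string instead of waiting for a
# scanner signature.

def find_exposed(inventory, product, vulnerable_versions):
    """Return (hostname, port, version) for assets running a vulnerable build."""
    hits = []
    for asset in inventory:
        for svc in asset["services"]:
            if svc["product"] == product and svc["version"] in vulnerable_versions:
                hits.append((asset["hostname"], svc["port"], svc["version"]))
    return hits

inventory = [
    {"hostname": "mail.example.com",
     "services": [{"port": 443, "product": "Exchange", "version": "15.1"}]},
    {"hostname": "www.example.com",
     "services": [{"port": 443, "product": "nginx", "version": "1.18"}]},
]

# News breaks of an unpatched flaw in (hypothetically) Exchange 15.0/15.1:
exposed = find_exposed(inventory, "Exchange", {"15.0", "15.1"})
```

The query itself is trivial; the hard part — and the part a static list cannot give you — is having fingerprinted, current data to run it against.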

Adversary attitudes change quickly

Just like the Internet changes quickly, so do adversaries. Adversaries may act a certain way one day and act an entirely different way another. There are a lot of reasons for that, but they all boil down to a change in attacker attitudes or priorities. The adversarial community may not even know you exist one day, and suddenly you become public enemy number one based on some issue that popped up on social media. Or you may not be a target one day, and suddenly, by virtue of someone else's demise, you may turn into the next most target-rich environment. Your knowledge of how adversaries think about and operate against your environment is very important.

However, if you have no idea what you own, how can you check for said activity? If an adversary is talking about “example.com,” but you don’t even know you own “example.com,” then how are you going to measure how adversaries are talking about it? How are you going to monitor for social signals, or hacking activity in underground channels, if you simply have no idea what assets you should be monitoring for?

The value of an asset can change quickly due to new features

One attribute of sites that seems to elude a lot of security experts is that the asset’s value is not a static thing. Often security people will focus on the fact that risks change, but that is mostly based on the vulnerabilities which may become known or might change in scope depending on other features of the site. But the site’s inherent value can also change over time.

The first way it can change is a decrease in value. This happens when:

  1. Features of the site are removed or pulled off the site: Think about when a company removes a credit card database and starts clearing credit cards with a third party. The value of the site to an attacker is wildly less if there is not anything in the database of value. That leaves the attacker with the much-diminished value of trying to read credit cards as they are entered, rather than all at once.
  2. The season of value is over: Lots of retail sites and landing pages are of extreme value during the Thanksgiving through Christmas time frame due to the holiday shoppers. Then you will tend to see a greatly diminished value of the site because the traffic is substantially less. For example, let us say the site is a one-off promotional deal that only lasts a few weeks. After those few weeks are up, no one will visit the site. We have seen this with landing pages associated with the Superbowl, for instance – a massive uptick on the first day and wildly diminished traffic in subsequent days. The value of the site literally changed with time, with no continuing promotions to drive traffic and no other factors at play.
  3. The site is deprecated: If backlinks no longer point to the site in favor of a new one, not only is the attacker less likely to find it, so are legitimate users, so it diminishes in both risk and value at the same time.

Inversely, asset value can increase over time in much the same ways:

  1. The site becomes popular: When a site becomes popular, you can see a massive uptick in the number of users who convert into leads and that ultimately convert into customers. Or you may have the best deal on the widget, and people are looking for deals. Or you may be found out to have the last lot of a very rare item. All of that improves the value of the site.
  2. You launch new features: The new features of the site can be of enormous value, making it more useful and improving the valuation of the company. An app that is fully functional is almost always worth more than one that has missing features.
  3. You start storing sensitive information: The value to the business to protect the app increases dramatically if the asset in question suddenly becomes the conduit for or the actual placement of sensitive information. That can be trade secrets, application code, customer lists, or traditional PII/PHI.

As such, it is clear that with all these factors in mind, static lists are simply the wrong way to handle attack surface mapping. Yet, not only are they the most common way that companies manage their assets, but they are also widely used by vendors, contractors, and third parties. There will often be a "mechanical Turk" (a person behind the curtain) who is tasked with updating the list semi-regularly. That at least has the advantage of updating at all, but with huge downsides: human error, version/revision control problems, etc. That is why automation wins – not just because it is up to date, but also because it is free of human laziness/incompetence.

Want to talk about the right way to do attack surface management? We’ll show you. Get in touch with us here.