12 Days of HaXmas: Political Pwnage in 2015

Published at 2016-01-02 18:30:05

This post is the ninth in the series, "The 12 Days of HaXmas." 2015 was a big year for cybersecurity policy and legislation; thanks to the Sony breach at the end of 2014, we kicked the new year off with a renewed focus on cybersecurity in the US Government. The White House issued three legislative proposals, held a cybersecurity summit, and signed a new Executive Order, all before the end of February. The OPM breach and a series of other high-profile cybersecurity stories continued to drive an enormous amount of debate on cybersecurity across the Government sphere throughout the year. Pretty much every office and agency is building a position on this topic, and Congress introduced more than 100 bills referencing "cybersecurity" during the year. So where has the security community netted out at the end of the year in terms of policy and legislation? Let's recap some of the biggest cybersecurity policy developments…

Cybersecurity Information Sharing

This was Congress' top priority for cybersecurity legislation in 2015, and the TL;DR is that an info sharing bill was passed right before the end of the year. The idea of an agreed legal framework for cybersecurity information sharing has merit; however, the bill has drawn a great deal of fire over privacy concerns, particularly with regard to how intelligence and law enforcement agencies will use and share information. The final bill was the result of more than five years of debate over various legislative proposals for information sharing, including three separate bills that went through votes in 2015 (two in the House and one in the Senate). In the end, the final bill was agreed through a conference process between the House and Senate, and included in the Omnibus Appropriations Act. Despite this being the big priority for cybersecurity legislation, a common view in the security community seems to be that it is unlikely to have much impact in the near term.
This is partly because organizations with the resources and ability to share cybersecurity information, such as large financial or retail organizations, are already doing so. The liability limitation granted in the bill means they are able to continue to do this with more confidence. It's unlikely to draw new organizations into the practice, as participation has traditionally centered more on whether the business has the requisite in-house expertise, and a risk profile that makes security a headline priority for the business, rather than questions of liability. For many organizations that strongly advocated for legislation, a key goal was to get the government to improve its processes for sharing information with the private sector. It remains to be seen whether the legislation will actually help with this.

Right to Research

For those that have read any of my other posts on legislation, you probably know that protecting and promoting research is the primary purpose of Rapid7's (and my) engagement in the legislative arena. This year was an interesting year in terms of the discussion around the right to research…

The DMCA Rulemaking

The Digital Millennium Copyright Act (DMCA) prohibits the circumvention of technical measures that control access to copyrighted works, and thus it has traditionally been at odds with security research. Every three years, there is a "rulemaking" process for the DMCA whereby exemptions to the prohibition can be requested and debated through a multi-phase public process. All granted exemptions reset at this point, and so even if your exemption has been passed before, it needs to be re-requested every three years. The idea of this is to help the law keep up with the changing technological landscape, which is sensible, but the reality is a pretty painful, protracted process that doesn't really achieve the goal.
In 2015, several requests for security research exemptions were submitted: two general ones, one for medical devices, and one for vehicles. The Library of Congress, which oversees the rulemaking process, rolled these exemption requests together, and at the end of October it announced approval of an exemption for good faith security research on consumer-centric devices, vehicles, and implantable medical devices. Hooray! Don't get too excited though – the language in the announcement sounded slightly as though the Library of Congress was approving the exemption against its own better judgment and with some heavy caveats, most notably that it won't come into effect for a year (apart from for voting machines, which you can start researching now). More on that here. Despite that, this is a positive step in terms of acceptance for security research, and demonstrates increased support and understanding of its value in the government sector.

CFAA Reform

The Computer Fraud and Abuse Act (CFAA) is the main anti-hacking law in the US, and all kinds of problematic. The basic issues as they relate to security research can be summarized as follows:

It's out of date – it was first passed in 1986, and despite some "updates" since then, it feels woefully out of date. One of the clearest examples of this is that the law talks about access to "protected computers," which back in '86 probably meant a giant machine with an actual fence and guard watching over it. Today it means pretty much any device you use that is more technically advanced than a slide rule.
It's ambiguous – the law hinges on the notion of "authorization" (you're either accessing something without authorization, or exceeding authorized access), yet this term is not defined anywhere, and hence there is no clear line of what is or is not permissible. Call me old fashioned, but I think people should be able to understand what is covered by a law that applies to them…

It contains both civil and criminal causes of action. Sadly, most researchers I know have received legal threats at some point. The vast majority have come from technology providers rather than law enforcement; the civil causes of action in the CFAA provide a handy stick for technology providers to wield against researchers when they are concerned about negative consequences of a disclosure.

The CFAA is hugely controversial, with many voices (and dollars spent) on all sides of the debate, and as such efforts to update it to address these issues have not yet been successful. In 2015 though, we saw the Administration looking to extend the law enforcement authorities and penalties of the CFAA as a means of tackling cybercrime. This focus found resonance on the Hill, resulting in the development of the International Cybercrime Prevention Act, which was then abridged and turned into an amendment that its sponsors hoped to attach to the cybersecurity information sharing legislation. Ultimately, the amendment was not included with the bill that went to the vote, which was the right outcome in my opinion. The interesting and positive thing about the process though was the diligence of staff in seeking out and listening to feedback from the security research community. The language was revised several times to address security research concerns.
To those who feel that the government doesn't care about security research and doesn't listen, I want to highlight that the care and consideration shown by the White House, Department of Justice, and various Congressional offices through discussions around CFAA reform this year suggests that is not universally the case. Some people definitely get it, and they are prepared to invest the time to listen to our concerns and get the approach right.

It's Not All Positive News

Despite my comments above, it certainly isn't all plain sailing, and there are those in the Government that fear that researchers may do more harm than good. We saw this particularly clearly with a vehicle safety bill proposal in the second half of the year, which would make car research illegal. Unfortunately, the momentum for this was fed by fears over the way certain high-profile car research was handled this year. The good news is that there were plenty of voices on the other side pointing out the value of research as the bill was debated in two Congressional hearings. As yet, this bill has not been formally introduced, and it's unlikely to be without a serious rewrite. Still, it behooves the security research community to consider how its actions may be viewed by those on the outside – are we really showing our good intentions in the best light? I have increasingly heard questions arise in the government about regulating research or licensing researchers. If we want to be able to address that kind of thinking in a constructive way to reach the best outcome, we have to demonstrate an ability to engage productively and responsibly.
Vulnerability Disclosure

Following high-profile vulnerability disclosures in 2014 (namely, Heartbleed and Shellshock), much talk around bug bounties, challenges with multi-party coordinated disclosures, and best practices for so-called "safety industries" – where problems with technology can adversely impact human health and safety, e.g. medical devices, transportation, power grids, etc. – the topic of vulnerability disclosure was once again on the agenda. This time, it was the Obama Administration taking an interest, led by the National Telecommunications and Information Administration (NTIA, part of the Department of Commerce). They convened a public multi-stakeholder process to tackle the thorny and already much-debated topic of vulnerability disclosure. The project is still in relatively early stages, and could probably do with a few more researcher voices, so get involved! One of the inspiring things for me is the number of vendors that are new to thinking about these things and are participating. Hopefully we will see them adopting best practices and leading the way for others in their industries. At this stage, participants have split into four groups to tackle multiple challenges: awareness and adoption of best practices; multi-party coordinated disclosure; best practices for safety industries; and economic incentives. I co-lead the awareness and adoption group with the incredible Amanda Craig from Microsoft, and we're hopeful that the group will come up with some practical measures to tackle this challenge. If you're interested in more information on this issue specifically, you can email us.

Export Controls

Thanks to the Wassenaar Arrangement, export controls became a hot topic in the security industry in 2015, probably for the first time since the Encryption Wars (Part I).
The Wassenaar Arrangement is an export control agreement amongst 41 nation states with a particular focus on national security issues – hence it pertains to military and dual-use technologies. In 2013, the members decided that should include both intrusion and surveillance technologies (as two separate categories). From what I've seen, the surveillance category seems largely uncontested; however, the intrusion category has caused a great deal of concern across the security and other business communities. This is a multi-layered concern – the core language that all 41 states agreed to raises concerns, and the US proposed rule for implementing the control raises additional concerns. The good news is that the Bureau of Industry and Security (BIS) – the folks at the Department of Commerce that implement the control – and various other parts of the Administration have been highly engaged with the security and business communities on the challenges, and have committed to redrafting the proposed rule, implementing a number of exemptions to make it livable in the US, and opening a second public comment period in the new year. All of which is actually kind of unheard of, and is a strong indication of their desire to get this right. Thank you to them! Unfortunately the bad news is that this doesn't tackle the underlying issues in the core language. The problem here is that the definition of what's covered is overly wide and limits sharing information on exploitation. This has serious implications for security researchers, who often build on each other's work, collaborate to reach better outcomes, and help each other learn and grow (which is also critical given the skills shortage we face in the security industry). If researchers around the world are not able to share cybersecurity information freely, we all become poorer and more vulnerable to attack.
There is additional bad news: more than 30 of the member states have already implemented the rule and seem to have fewer concerns over it. This means the State Department, which represents the US at the Wassenaar discussions, is not enthusiastic about revisiting the topic and requesting the language be edited or overturned. The Wassenaar Arrangement is based on all members arriving at consensus, so all must vote and agree when a new category is added, meaning the US agreed to the language and agreed to implement the rule. From State's point of view, we missed our window to raise objections of this nature and it's now our responsibility to find a way to live with the rule. Dissenters ask why the security industry wasn't consulted BEFORE the category was added. The bottom line is that while the US Government can come up with enough exemptions in their implementation to make the rule toothless and not worth the paper it's written on, it will still leave US companies exposed to greater risk if the core language is not addressed. As I mentioned, we've seen excellent engagement from the Administration on this issue and I'm hopeful we'll find a solution through collaboration. Recently, we've also seen Congress start to pay close attention to this issue, which is also likely to help move the discussion forward: in December, 125 members of the House of Representatives signed a letter addressed to the White House asking them to step into the discussion around the intrusion category. That's a lot of signatories and will hopefully encourage the White House to get involved in an official capacity. It also indicates that Wassenaar is potentially going to be a hot topic for Congress in 2016.
Reflecting that, the House Committee on Homeland Security and the House Committee on Oversight and Government Reform are joining forces for a joint hearing on this topic in January. The challenges with the intrusion technology category of the Wassenaar Arrangement highlight a hugely complex problem: how do we reap the benefits of a global economy while clinging to regionalized nation state approaches to governing that economy? How do you apply nation state laws to a borderless domain like the internet? There are no easy answers to these questions, and we'll see the challenges continue to arise in many areas of legislation and policy this year.

Breach Notification

In the US, there was talk of a federal law to set one standard for breach notification. The US currently has 47 distinct state laws setting requirements for breach notification. For any businesses operating in multiple states, this creates confusion and administrative overhead. The goal for those that want a federal breach notification law is to simplify this by having one standard that applies across the entire country. In principle this sounds very sensible and reasonable. The problem is that the federal legislative process does not move quickly, and there is concern that by making this a federal law, it will not be able to keep up with changes in the security or information landscape, and thus consumers will end up worse off than they are today. To address this concern, consumer protection advocates urge that the federal law not pre-empt state laws that set a higher standard for notification. However, this does not alleviate the core problem any breach notification bill is trying to get at – it just adds yet another layer of confusion for businesses. So I suspect it's unlikely we'll see a federal breach notification bill pass any time soon, but I wouldn't be surprised if we see this topic come up again in cybersecurity legislative proposals this year.
Across the pond, there was an interesting development on this topic at the end of the year – the European Union issued the Network and Information Security Directive, which, amongst other things, requires that operators of critical national infrastructure must report breaches (interestingly, this is kind of at odds with the US approach, where there is a law protecting critical infrastructure information from public disclosure). The EU directive is not a law – member states will now have to develop their own in-country laws to put it into practice. This will take some time, so we won't see a change straight away. My hope is that many of the member states will not limit their breach notification requirements to only organizations operating critical infrastructure – consumers should be informed if they are put at risk, regardless of the industry. Over time, this could mark a significant shift in the dialogue and awareness of security issues in Europe; today there seems to be a feeling that European companies are not being targeted as much as US ones, which seems tough to believe. It seems likely to me that part of the reason we don't hear about it so much is because the breach notification requirement is not there, and many victims of attacks keep it confidential.

Cyber Hygiene

This was a term I heard a great deal in government circles this year as policy makers tried to come up with ways of encouraging organizations to put basic security best practices in place. The kinds of things that would be on the list here would be patching, using encryption, that kind of thing. It's nearly impossible to legislate for this in any meaningful way, partly because the requirements would likely already be out of date by the time a bill passed, and partly because you can't take a one-size-fits-all approach to security.
It's more productive for Governments to take a collaborative, educational approach and provide a baseline framework that can be adapted to an organization's needs. This is the approach the US takes with the NIST Framework (which is due for an update in 2016), and similarly CESG in the UK provides excellent non-mandated guidance. There was some discussion around incentivizing adoption of security practices – we see this applied with liability limitation in the information sharing law. Similarly, there was an attempt at using this carrot to incentivize adoption of security technologies: the Department of Homeland Security (DHS) awarded FireEye certification under the SAFETY Act. This law is designed to encourage the use of anti-terrorism technologies by limiting liability for a terrorist attack. So let's say you run a football stadium and you deploy body scanners for everyone coming onto the grounds, but someone still manages to smuggle in and set off an incendiary device; you could be protected from liability because you were using the scanners and taking reasonable measures to stop an attack. In order for organizations to receive the liability limitation, the technology they deploy must be certified by DHS. Now when you're talking about terrorist attacks, you're talking about some very extreme circumstances, with very extreme outcomes, and something that statistically is rare (tragically not as rare as it should be). By contrast, cybercrime is extremely common, and can range vastly in its impact, so this is basically like comparing apples to flamethrowers. On top of that, using a specific cybersecurity technology may be less effective than an approach that layers a number of security practices together, e.g. setting appropriate internal policies, educating employees, patching, air gapping, etc.
Yet, if an organization has liability limitation because it is deploying a security technology, it may feel these other measures are unnecessarily costly and resource hungry. So there is a pretty reasonable concern that applying the SAFETY Act to cybersecurity may be counter-productive and actually encourage organizations to take security less seriously than they might without liability limitation. Nonetheless, there was a suggestion that the SAFETY Act be amended to cover cybersecurity. Following a Congressional hearing on this, the topic has not raised its head again, but it may reappear in 2016.

Encryption

Unless you've been living under a rock for the past several months (which might not be a terrible choice, all things considered), you'll already be aware of the intense debate raging around mandating backdoors for encryption. I won't rehash it here, but have included it because it's likely to be The Big Topic for 2016. I doubt we'll see any real resolution, but expect to see much debate on this, both in the US and internationally.

~@infosecje

Source: rapid7.com
