Thursday, April 29, 2010

"All Your 900 MHz Are Belong to Us"

If you were asked, “Does your organization use unencrypted wireless communications?”, what would your answer be? Common responses include “We don’t use wireless networks” or “Our cell phones are our only wireless devices.” These answers may be partly true, but many organizations haven’t fully thought through their answer and their assets. The 900 MHz frequency range in particular comes to mind: it is used by many common devices, yet corporate traffic on it is often carried in an insecure manner.

In short, the 900 MHz frequency range is an attacker’s playground. A surprising amount of information can be gleaned from this space, and many folks are unaware of it. Two-way radios, simple wireless communication devices, and other equipment that use this open communication channel are more common than one might think. I’ll examine two cases in which SecureState engineers obtained valuable information through trivial methods during physical penetration tests and social engineering exercises.

First, SecureState was hired to perform work for a casino in the United States. Engineers were staying at a hotel approximately three to four miles from the casino. From the hotel, a simple ham radio tuned to the 900 MHz range was enough to eavesdrop on the radio conversations of casino guards. From those conversations, one could identify when guard shift changes occurred and when large sums of money were being transported, along with their origin and destination. It doesn’t take a rocket scientist to explain why this is a problem, and more sophisticated attacks could be built on this information. With a sub-$100 radio readily available at your neighborhood Radio Shack, the 900 MHz range may well be carrying your organization’s unencrypted communications to anyone who cares to listen.

Second, SecureState again fired up a ham radio to perform reconnaissance for a physical penetration test of a financial institution. While perusing the 900 MHz range, engineers discovered that unencrypted wireless telephone headsets were in use in the helpdesk area. From there, SecureState was able to listen to password reset calls and other issues being handled at the target institution. There is no question why this is a problem, and it doesn’t end there. Worse yet, even after a call ends and the headset is returned to its cradle to charge, it acts as a bug in the office: the headset keeps transmitting despite not being on a call. This means that all conversation in the helpdesk area, even between telephone calls, can be eavesdropped upon! One solution to this exposure is to use the Plantronics CS55 or CS70 digital headset models, which digitally encode and encrypt the audio and transmit it using TDMA technology; these provide sufficient protection against wireless headset eavesdropping. As a best practice, it is also recommended that executives and their assistants not use wireless headsets for sensitive communications.

With those two simple case studies, it is clear that with less than $100 of readily available equipment, your organization may be vulnerable to this kind of eavesdropping. When your organization performs its regular 802.11 wireless enumeration looking for rogue access points, perhaps the 900 MHz frequency range should be included as well.


Wednesday, April 28, 2010

Trouble in the Cloud

Our development team initially chose Microsoft Azure as the primary platform for hosting our external website. We signed up as Community Technology Preview (CTP) members to receive the "Introductory Special," which included access to the content delivery network at no additional charge. Microsoft Azure guarantees connectivity at least 99.95% of the time, and that 99.9% of the time it will successfully process add, update, read, and delete requests. Unfortunately, we experienced at least five outages over a three-month period that we had NO control over. Due to these outages and many other issues we encountered while hosting with Microsoft Azure, our development team decided to move our application off the platform.
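The gap between that SLA and our experience is easy to quantify. Here is a quick back-of-the-envelope sketch in Python; note the 30-minute outage duration is a hypothetical figure for illustration, not a measured one:

```python
# Downtime budget permitted by an availability SLA over a given window.
def allowed_downtime_minutes(sla_percent, days):
    total_minutes = days * 24 * 60
    return total_minutes * (1 - sla_percent / 100)

# Azure's stated connectivity SLA: 99.95% uptime.
budget = allowed_downtime_minutes(99.95, 90)
print(f"99.95% over 90 days allows {budget:.1f} minutes of downtime")

# Five outages in three months: even at a modest (hypothetical)
# 30 minutes each, the budget is exceeded several times over.
observed = 5 * 30
print(f"Observed (assuming 30 min each): {observed} minutes")
```

A 99.95% guarantee sounds generous until you see that it allows barely an hour of downtime per quarter.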

Let's talk about data backups. There is not much to say, because you cannot back up your database or any of the content hosted with Microsoft Azure; you can, however, take "snapshots" of individual items in each of your storage containers. To do this, we used a Windows-based client called Cloud Storage Studio by Cerebrata to manage our content: http://www.cerebrata.com/Products/CloudStorageStudio/Default.aspx. This product alleviated some of the qualms we had with Microsoft Azure. Another approach to backups was simply keeping a local copy of our SQL database and using SQL Compare by Redgate to synchronize it with the SQL Azure tables in the cloud: http://www.red-gate.com/products/SQL_Compare/index.htm.
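The snapshot-and-compare approach these tools take boils down to a simple pattern: copy an item only when its content has changed. A minimal, generic sketch of that idea (this is not how Cloud Storage Studio or SQL Compare are implemented, just the underlying pattern applied to local directories):

```python
import hashlib
import shutil
from pathlib import Path

def sync(src: Path, dst: Path) -> None:
    """Mirror files from src into dst, copying only files whose content differs."""
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if (target.exists()
                and hashlib.sha256(target.read_bytes()).digest()
                == hashlib.sha256(f.read_bytes()).digest()):
            continue  # content unchanged, skip the copy
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)
```

Comparing hashes before copying is what makes repeated syncs cheap: unchanged items cost a read, not a transfer.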

Pricing is another area to watch when using Microsoft Azure. Microsoft may lure you in with its "Introductory Special," offering free services and no monthly commitment, but we saw a high volume of outgoing and incoming requests early on and noticed how quickly the fees accumulated. In fact, our Azure costs nearly tripled by our third month. For details on how storage, data transfers, compute time, and transactions are measured, please read the Microsoft Azure pricing guide: http://www.microsoft.com/windowsazure/pricing/.
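To see how quickly metered fees accumulate, here is a toy cost estimator. The per-unit rates below are placeholders invented for illustration, not Microsoft's actual prices; the pricing guide above has the real figures:

```python
# Illustrative cloud bill estimator. All rates are made-up placeholders,
# NOT Microsoft's published prices -- consult the pricing guide for those.
RATE_COMPUTE_PER_HOUR = 0.12   # per instance-hour (placeholder)
RATE_STORAGE_PER_GB   = 0.15   # per GB-month stored (placeholder)
RATE_EGRESS_PER_GB    = 0.15   # per GB transferred out (placeholder)
RATE_PER_10K_TXNS     = 0.01   # per 10,000 storage transactions (placeholder)

def monthly_cost(instance_hours, storage_gb, egress_gb, transactions):
    return (instance_hours * RATE_COMPUTE_PER_HOUR
            + storage_gb * RATE_STORAGE_PER_GB
            + egress_gb * RATE_EGRESS_PER_GB
            + transactions / 10_000 * RATE_PER_10K_TXNS)

# One always-on instance, modest storage, heavy request traffic:
print(f"${monthly_cost(730, 5, 40, 2_000_000):.2f}")
```

The point is the shape of the bill, not the numbers: compute dominates a small site, but transaction and bandwidth charges scale directly with traffic you do not control.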

Uploading updates to the cloud has been a very painful process. Every update we pushed to Microsoft Azure took anywhere from 15 to 25 minutes to process an 8-10 MB package. That means 15 to 25 minutes of downtime for our live application! And that is not even the most frustrating part. Cache is not king when it comes to updating content on Microsoft Azure. Azure distributes content across dozens of servers around the world so you can access stored content faster, but if your application is cached on even a few of those servers, it takes about 48-72 hours for the caches to update. Caching can be turned off; however, disabling it may cost you performance when accessing content. We saw many cases where a simple change to a graphic was not reflected on the live application, and we had to suspend or restart the application, losing another 15-20 minutes, just to see the change.

Overall, Microsoft Azure does have advantages over a single-server hosting solution. Microsoft Azure, or any other cloud computing platform, might be the better choice for a globally distributed application processing data-intensive transactions, where its worldwide footprint supplies the necessary bandwidth and computing power.

In my opinion, Microsoft Azure is unstable and could be improved with options such as remote access to SQL Azure tables and a more efficient way to release application updates with less downtime. After some thought and discussion about the weaknesses we encountered while hosting with Microsoft Azure, our team has decided to move our application onto a more stable, cost-effective single-server hosting environment.

