Brazilian Blackout – Our Comments on "The Truth About the Blackout" Post
Much has been said over the past week about the causes of the great blackout that occurred on November 10th, 2009. The Brazilian government has reported atmospheric problems as its main cause, but it was quite a coincidence that a CBS 60 Minutes report had been broadcast shortly before, claiming that the Brazilian power grid was vulnerable to hacker attacks.
To make the reasons for the blackout even more confusing, Maycon Vitali, a security researcher and professor at UVV in Vila Velha, Espirito Santo State, Brazil, published a post on his blog (pt_BR http://blog.hacknroll.com/2009/11/12/a-verdade-sobre-o-apagao/) demonstrating web security flaws in the site of ONS (Brazil's National Electric System Operator). The post received thousands of visits, was heavily commented on in the web community, was picked up by big media outlets such as infoexame, G1, and Band, and was widely referenced on personal blogs and Twitter.
We believe that many people have misread the contents of that post, and that a great (and unnecessary) hullabaloo has been made about the matter.
Below we comment on the post and on some of the erroneous interpretations.
Firstly, there is the comment about the robots.txt file on the ONS site:
What’s robots.txt?
As the name says, it is a plain-text file that works as a filter for crawlers, enabling webmasters to control access permissions to specific areas of their sites. The robots.txt file controls which parts of a site should (or should not) be indexed by search engines. The file syntax is very simple, and the webmaster responsible for the site should place the file in the root of the hosting.
In the case of the robots.txt in question, it applies to every user-agent performing crawler activity and blocks two directories:
User-agent: *
Disallow: /agentes/agentes.aspx
Disallow: /download/agentes/
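
To illustrate how a well-behaved crawler interprets such a file, here is a minimal sketch using Python's standard urllib.robotparser module (the robots.txt URL is the real ONS site as of the post; assuming the file is still online, everything else is purely illustrative):

    import urllib.robotparser

    # Parse the site's robots.txt the same way a well-behaved crawler would.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.ons.org.br/robots.txt")
    rp.read()

    # Paths listed under "Disallow" are off-limits to every user-agent ("*").
    print(rp.can_fetch("*", "http://www.ons.org.br/agentes/agentes.aspx"))  # False
    print(rp.can_fetch("*", "http://www.ons.org.br/index.html"))            # True

Note that robots.txt is only a request to crawlers, not an access control: any person or tool that ignores it can still open the listed paths directly, which is exactly what happened here.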
By accessing the directories that should not be indexed, we noticed among the links (some pointing to applications such as Citrix) the web system where the blog post originated.
According to the post, the author tried to access one of the applications in the list and, on the login form, entered a single quote, causing the result below:
[IfxException: ERROR [HY000] [Informix .NET provider]General error.]
   IBM.Data.Informix.IfxConnection.HandleError(IntPtr hHandle, SQL_HANDLE hType, RETCODE retcode) +27
   IBM.Data.Informix.IfxCommand.ExecuteReaderObject(CommandBehavior behavior, String method) +739
   IBM.Data.Informix.IfxCommand.ExecuteReader(CommandBehavior behavior) +104
   IBM.Data.Informix.IfxCommand.ExecuteReader() +48
   OnsClasses.OnsData.OnsCommand.ExecuteReader()
   IntUnica.Menu.btnOk_Click(Object sender, ImageClickEventArgs e)
   System.Web.UI.WebControls.ImageButton.OnClick(ImageClickEventArgs e) +109
   System.Web.UI.WebControls.ImageButton.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +69
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +18
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
   System.Web.UI.Page.ProcessRequestMain() +1292
As I mentioned on mailing lists and in conversations with friends, based on the post and the error message shown WE CANNOT STATE that there is SQL injection in the application: all that was printed on the screen was an exception (stack trace), which on its own constitutes an "A6 – Information Leakage and Improper Error Handling" failure. Admittedly, if we look at statistics on this type of Informix error, the majority of such cases do turn out to involve an actual SQL injection. But to know the truth it would be necessary to carry out tests, which would be illegal since we have no authorization to do so.
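
To make the distinction concrete, here is a minimal sketch in Python, using sqlite3 as a stand-in for Informix (the table and queries are purely illustrative, not taken from the ONS application). It shows why a single quote blows up a query built by string concatenation, and how a parameterized query treats the same input safely:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (login TEXT, password TEXT)")
    conn.execute("INSERT INTO users VALUES ('admin', 'secret')")

    user_input = "'"  # the single quote described in the post

    # Fragile pattern: the input is concatenated straight into the SQL text.
    # The stray quote breaks the statement and the database raises an error,
    # exactly the kind of exception that ends up on screen when error
    # handling is improper.
    try:
        conn.execute("SELECT * FROM users WHERE login = '" + user_input + "'")
    except sqlite3.OperationalError as exc:
        print("database error:", exc)

    # Safe pattern: a parameterized query treats the input as data only,
    # so the quote cannot alter the structure of the SQL statement.
    rows = conn.execute("SELECT * FROM users WHERE login = ?",
                        (user_input,)).fetchall()
    print("rows:", rows)  # [] – no user whose login is a single quote

The database error alone proves only that the exception leaked to the browser; whether the underlying concatenation is actually exploitable is a separate question that only authorized testing could answer.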
What is an A6 – Information Leakage and Improper Error Handling failure?
Applications can unintentionally leak information about their configuration or internal workings, or violate privacy, through a variety of problems. An application can leak details of its internal workings through the time it takes to execute a given operation, or through different responses to different inputs, such as displaying the same error message with different error codes. Web applications frequently leak information about their internal workings through detailed error or debug messages. This information often serves as the starting point for launching attacks, or for aiming more powerful tools.
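
A common remedy, sketched below in Python (the handler and query names are our own illustration, not from the post), is to log the full stack trace where only operators can see it and return nothing but a generic message to the browser:

    import logging
    import traceback

    logging.basicConfig(filename="app-errors.log", level=logging.ERROR)

    def handle_login(execute_query, login, password):
        """Run the login query, hiding internal details from the client."""
        try:
            return execute_query(login, password)
        except Exception:
            # Keep the full stack trace in the server-side log...
            logging.error("login query failed:\n%s", traceback.format_exc())
            # ...and give the client a message that leaks nothing internal.
            return "An internal error occurred. Please try again later."

    def failing_query(login, password):
        # Stand-in for the Informix call that raised the exception.
        raise RuntimeError("General error.")

    print(handle_login(failing_query, "admin", "'"))

In an ASP.NET application such as the one behind the stack trace above, the equivalent first step would be enabling custom error pages (customErrors in web.config) so that raw exceptions never reach the browser.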
Conclusions:
- Access to the agents directory cannot be regarded as a flaw in itself, since robots.txt is used worldwide to keep information from being indexed by search engines such as Google and Yahoo; still, since the directory exposes applications, it would be a best practice to password-protect /agentes/.
- A stack trace or inadequate error handling does not mean that the site has a SQL injection vulnerability in the application; one may notice, however, that input data sanitization has not been performed.
- We do not know what can be found inside the applications, so there is no basis for such a hullabaloo, let alone for claiming that a confirmed SQL injection would have any relation to the big blackout.
- Internally they most likely have perimeter defense tools and stronger authentication mechanisms but, as noted, this is based only on assumptions.
- The Brazilian government should invest more in the web security of its environments: routinely run web vulnerability analysis tools, deploy a Web Application Firewall (WAF), and adopt a Secure Development Lifecycle (SDLC) internally.
N-Stalker Team