The Security Aspect and Hacking Techniques: The Usual Nature of Services


Abstract

Sharing one's interests and collaboratively shaping one's identity in a virtual world cannot be governed by strict regulation alone. This paper illustrates the security aspects of Web 2.0 and its technologies at large. The shift of authority toward users, who now supply and shape dynamic content, makes those users feel engaged, important and validated when interacting with Web 2.0 elements. The security risks arrive with the same shift, as malicious actors attempt to step into other people's shoes and illicitly take control of their privileged areas. The technologies that construct the Web 2.0 world, chiefly Ajax and XML, are discussed at length, together with their vulnerabilities and the means of securing them against malicious use. The threats considered include cross-site scripting, client-side validation bypass, prototype theft, SQL injection and more. The technological discussion addresses these security penetrations so that the main pitfalls are covered, and each security weakness is presented with a proposed solution. The correct blend of technology, sound engineering principles and a responsible, non-malicious attitude will let Web 2.0 enhance and empower its users with confidence.

Introduction

Web 2.0 and its accompanying technologies are examined here with the aim of addressing their vulnerabilities as fully as possible. The first section takes up the Web 2.0 technologies and the frameworks developed around them, together with the languages and protocols involved. The second section highlights the principal Web 2.0 security concerns and discusses solutions to them; these vulnerabilities must be understood before they can be solved. Security topics such as Flash hacking, and SOAP, REST and XML-RPC hacking, are covered. The next section discusses Ajax security and its associated vulnerabilities. Exposing these threats educates users and developers alike and encourages sounder engineering practice, and the resulting decisions about security policy allow the Ajax technology to be explored more deeply and safely. The conclusion provides an overview of the whole work, summarizes the paper and emphasizes how Web 2.0 and its security threats are organized.

Web 2.0 Technology Standards

The primary driver of Web 2.0 is the emergence of new web technologies and standards. The most relevant of these are identified below.

1. Ajax: With the coming of Web 2.0, Ajax forms the major breakthrough. AJAX stands for Asynchronous JavaScript and XML; it lets a page refresh content in the background after the page has finished loading. Its great advantage is that only a small amount of information passes to the server for each update, which reduces the access time that the older synchronous behaviour of web pages required. The information updates automatically and uses the available bandwidth well. A minimal sketch of this asynchronous pattern follows the figure below.

Figure 1: Asynchronous nature of AJAX (URL: <www.securityfocus.com/infocus/1868>)
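To make the asynchronous behaviour concrete, here is a short browser-side sketch using the standard XMLHttpRequest object. The endpoint /latest-news and the element id news are hypothetical, introduced only for illustration; the paper itself names no concrete endpoints.

```javascript
// Minimal asynchronous Ajax request, as described above.
// The URL "/latest-news" and element id "news" are hypothetical.
function refreshNews() {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", "/latest-news", true); // true = asynchronous
  xhr.onreadystatechange = function () {
    // readyState 4 means the request is complete.
    if (xhr.readyState === 4 && xhr.status === 200) {
      // textContent (not innerHTML) treats the response as plain
      // text, which matters for the XSS discussion later on.
      document.getElementById("news").textContent = xhr.responseText;
    }
  };
  xhr.send(); // returns immediately; the page stays responsive
}

// Update every 30 seconds without reloading the whole page.
setInterval(refreshNews, 30000);
```

Only the changed fragment travels over the wire, which is exactly the bandwidth saving the section describes.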
2. REST: Representational State Transfer (REST) is a software architectural style in which every resource is defined and addressed by a global identifier. To read or modify these resources, client and server exchange representations of them over a communication protocol, normally HTTP. REST facilitates improved response times and lighter server loading; little software is required on the client side, so client complexity stays low. The style was made precisely to scale hypermedia distribution and is architecturally neutral. The REST principles state that application state and operations are defined as resources, whose links share a common interface for the transfer of state.

Figure 2: REST derivation by style constraints.

3. SOAP: W3Schools.com (2008) describes SOAP, the Simple Object Access Protocol, as an application-level protocol carried over a transport-level protocol. Travelling over HTTP, SOAP permits safer communication through proxies and firewalls, and it is versatile enough to serve users of different transport protocols. Its primary features are that it is platform independent, language independent, quite simple and extensible; these features make it compatible across platforms and give applications an efficient way to communicate with one another. Its operation modelling is, however, open to doubt, and its POST-based binding is often a security threat. A sketch contrasting a REST call with a SOAP call appears after the figure below.

Figure 3: SOAP in the Web Architecture (URL: http://books.google.co.in/books?id=LEpPzQ5mRDoC&pg=PA71&lpg=PA71&dq=SOAP%2Bmodel&source=web&ots=1Ng3M1Fbja&sig=Ktccw9CqiEK5QJDLdel9zQxSzdo&hl=en#PPA72,M1)
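The contrast between the two styles fits in a few lines: REST names the resource in the URI itself, while SOAP posts an XML envelope naming the operation to a single endpoint. The URIs, the CustomerService endpoint and the GetCustomer operation below are hypothetical; only the SOAP 1.1 envelope namespace is standard.

```javascript
// REST: the URI itself identifies the resource (hypothetical URI).
var rest = new XMLHttpRequest();
rest.open("GET", "/customers/42", true);
rest.onload = function () { console.log("REST:", rest.responseText); };
rest.send();

// SOAP: one endpoint; the operation is named inside an XML envelope.
// Endpoint and operation are hypothetical; the envelope namespace is
// the standard SOAP 1.1 one.
var envelope =
  '<?xml version="1.0"?>' +
  '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
  '<soap:Body>' +
  '<GetCustomer xmlns="http://example.com/customers"><Id>42</Id></GetCustomer>' +
  '</soap:Body>' +
  '</soap:Envelope>';

var soap = new XMLHttpRequest();
soap.open("POST", "/services/CustomerService", true);
soap.setRequestHeader("Content-Type", "text/xml; charset=utf-8");
soap.onload = function () { console.log("SOAP:", soap.responseText); };
soap.send(envelope);
```

The POST-based binding noted above is a recurring concern precisely because the operation is buried in the request body, where intermediaries such as firewalls cannot easily inspect it.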
Web 2.0 Hacking Methods and Security Concerns

Exposing the risks of Web 2.0 means highlighting the vulnerabilities that arrive with the technology. Many of these concerns arose because individuals passionately developed intelligent programs with negative and illicit intent, spreading malware and other deliberately harmful programs that damage a particular business or the internet community at large. The concerns depend on the technologies in use, Ajax chief among them. The composite security concerns are as follows:

1. Content exploitation: It is quite often seen in practice that content is misrepresented with wrong and misleading information about a subject. Content uploaded and edited by users, as on Wikipedia, is frequently not properly referenced and is therefore not accepted as a valid source in many universities; readers pursuing their own objectives through such content can be badly misled.

2. Identity theft: Websites like MySpace.com, FaceBook.com and Orkut.com help a person maintain a representation of themselves on the internet, letting users express themselves, create a community of friends, join other communities and make new ones. On the internet, people prefer dealing with other human beings to dealing with information or machines. The internet has been adopted by a large portion of the population who are not necessarily technophiles and who are far more interested in the "human and social life", which makes the human role more prominent. They participate in digital forums around communities of interest: chatting, posting stories and news events, blogs, video, games and much else. Interestingly, even the more traditional information perspective is being extended with social aspects that help manage information. For instance, opinion and social translucence mechanisms (Erickson et al., 2002) are used in electronic marketplaces such as eBay to facilitate evaluation of the quality and relevance of product information, and coordination mechanisms are used in Wikipedia to facilitate the collaborative construction of an online encyclopedia. The major factor in identity theft is the theft of one's actual self, which poses the greater threat to maintained profiles: an attacker can post illicit comments to embarrass the victim or to pass the blame to someone else.

3. Access control: Models that let people control their access areas, protect document and profile formats, and save passwords are essential for keeping the large number of malicious internet users from harming a person's well-being and online identity. Access areas govern who may be invited into a friends circle and which keywords reveal a profile. The security policy at this level should be to disallow any unknown person by default, and to bring enough ethics and a positive attitude to the methods used when communicating with other people and their communities.

4. Denial-of-service attacks: These attacks aim to stop or slow someone's access to a particular website or piece of information. Some denial-of-service attacks target financial information, ensuring that the right and timely information does not reach the intended recipient.

5. How to safeguard the security aspects: Exploit Prevention Labs has devised strategies for safeguarding users against malicious programs and practices.

Figure 4: Exploit Intelligence Network

It takes care of the following issues:
a. It operates a global network of honeypot nodes to actively monitor cyber-criminal activity.
b. It monitors human activity, terrorist activity and hacker mailing lists.
c. It monitors SiteID, a tool that digs into a site's ownership to determine whether the site is really operated by the owner who claims it.

It further employs a correlation engine that collects the research data above for most sites and assembles it in real time, providing exploit-specific protection within minutes (Pat, 2006). LinkScanner is a Windows application that provides real-time, automatic protection against malicious websites and other cyber-criminal activity. LinkScanner's dynamic approach scans a website and its content, providing timely and accurate analysis of websites while steering users away from suspicious activity and vulnerabilities. Its other protections are as follows:
- Protects against exploits, hacked websites, phishing, social engineering, malicious lure sites and adware server attacks.
- Advises whether a site can safely be visited a second time.
- Takes care of data streaming and data handling, as well as packet filtering and packet fragmentation.
- Announces which sites are safe for browsing.
- Checks the entire URL for site insecurities and other security problems.
A sketch of the general kind of URL check such tools perform follows.
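The paper does not detail LinkScanner's internal logic, so the sketch below only illustrates the general idea of screening a URL against known-bad hosts and suspicious patterns before visiting it; the blocklist entries and patterns are invented for the example.

```javascript
// Illustrative URL screening, loosely in the spirit of scanners such
// as LinkScanner. All blocklist entries and patterns are hypothetical.
var badHosts = ["evil.example.com", "lure-site.example.net"];
var suspiciousPatterns = [
  /@/,                              // user-info trick: http://bank.com@evil.com/
  /^https?:\/\/\d+\.\d+\.\d+\.\d+/, // raw IP address instead of a hostname
  /%00/                             // embedded NUL byte in the URL
];

function checkUrl(url) {
  var host;
  try {
    host = new URL(url).hostname;
  } catch (e) {
    return "invalid"; // reject unparsable URLs outright
  }
  if (badHosts.indexOf(host) !== -1) return "blocked";
  for (var i = 0; i < suspiciousPatterns.length; i++) {
    if (suspiciousPatterns[i].test(url)) return "suspicious";
  }
  return "ok";
}

console.log(checkUrl("http://evil.example.com/offer")); // "blocked"
console.log(checkUrl("http://203.0.113.7/login"));      // "suspicious"
console.log(checkUrl("https://example.org/article"));   // "ok"
```

A real product would of course combine such local checks with the real-time exploit intelligence described above.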
Web 2.0 Hacking Methods

Web 2.0 hacking methods exploit either client-side or server-side vulnerabilities. Special tools and techniques exist on the content-management side as well: Apache Jackrabbit is a content repository, an implementation of the Java Content Repository standard that provides flexible, queryable storage for structured content, and Apache Cocoon is an XML framework that makes applications flexible and serves as an excellent integration tool (Trieloff, 2008).

1. Scanning techniques and challenges: Scanning is classified into two areas.
Server-side applications: the buried resources of the server, such as the links and assets belonging to a particular application. Scanning must cover the back-end web services, mashups, proxies and the many other services that are prone to security attacks; web services built with JSP and ASP need particular care if they are to be deployed safely.
Client-side applications: client-side languages such as JavaScript must be checked, since they are a frequent target of attack. Attention to the server-side code must increase in step, so that its complexities are smoothed out as well.

2. RSS and ATOM feed attacks: RSS feeds carry the following vulnerabilities. The feed owner may be malicious in the first place, which is the most direct source of risk. If an RSS provider is hacked, and that provider in turn feeds others, the infection spreads globally. CgiSecurity (2008) mentions that "an attacker deciding to inject malicious payloads into a feed rather than deface the site has a greater chance of evading detection for a longer period of time, and thus to affect more machines". Web-based feeds are often generated from mailing lists, bulletin-board messages, P2P websites, BitTorrent sites or user postings on a blog, which makes a malicious payload all the more likely. Transport through a poisoned proxy cache raises the risk of damage further.

3. SOAP, REST and XML-RPC hacking: The parsing step in XML-RPC is itself a security issue, since without it no information can be extracted from a request. POST-based requests in the REST style are prone to hacking, and parameters sent with the URL are prone to theft.

4. Flash hacking: Flash hacking is quite varied, ranging from changing pixels, through correcting and fabricating distortions, to masking. The basic technique changes pixel strength and intensity to such an extent that a different image with different properties is recreated; masking features overtake the picture's purpose with the help of timers.

5. Fuzzing and code-review methodologies and tools: Fuzzing and code review are the fallbacks for the various Web 2.0 technologies, exercised to check code before deployment and periodically afterwards for any accidental or malicious changes. Review schemes of this kind accompany any engineering product. The tools include the following; a minimal fuzzing harness is sketched after this list.
Crucible: a web-based tool for reviewing changes in code and versions under source-control software (Atlassian.com, 2008).
JCodeReview: an open-source Java toolkit for code review.
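To illustrate the fuzzing idea concretely, the following sketch mutates a valid input at random and feeds each variant to a parser, recording the inputs that raise errors. JSON.parse is used as a stand-in target here; a real harness would aim at the application's own parsers and request handlers.

```javascript
// Minimal random-mutation fuzzing harness (illustrative only).
function mutate(seed) {
  var chars = seed.split("");
  var i = Math.floor(Math.random() * chars.length);
  // Replace one character with a random byte value.
  chars[i] = String.fromCharCode(Math.floor(Math.random() * 256));
  return chars.join("");
}

function fuzz(target, seed, iterations) {
  var failures = [];
  for (var n = 0; n < iterations; n++) {
    var input = mutate(seed);
    try {
      target(input);
    } catch (e) {
      // Record what failed and how; a reviewer then looks for
      // anything beyond the expected plain parse error.
      failures.push({ input: input, error: String(e) });
    }
  }
  return failures;
}

var crashes = fuzz(JSON.parse, '{"user":"alice","admin":false}', 1000);
console.log(crashes.length + " of 1000 mutated inputs raised errors");
```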
Ajax Security

Ajax, which stands for Asynchronous JavaScript and XML, combines the features of both client-side and server-side scripting. It is a comparatively new technology, and its wide use across Web 2.0 makes careful security verification essential. The following figure illustrates the Ajax scenario.

Figure 5: AJAX asynchronous scripting

a. Ajax client-side security controls: Security implemented through client-side controls is one of the major flaws in such software. Intruders depend on those client-side possibilities when probing and penetrating a website. In the absence of sufficient server-side validation code, the application is vulnerable to many more attacks: any technical person with a little knowledge of HTML can modify the client-side code and insert code of their own, or build a false website carrying such malicious code. Client-side enforcement is extremely insecure, so the best approach is to place the controls on the server side rather than the client side.

Figure 6: AJAX code depicting vulnerability at the client side (URL: http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VJG-4NGKDY6-4&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=fd538e050413237f063d7131d3e54841)

b. Ajax cross-site scripting (XSS): Ajax makes XSS vulnerabilities more visible, since XSS holes arise in processing where no multi-threading concepts are used at all. When a user browses through a single-threaded operation, the session becomes susceptible to a variety of issues, and essential data that is private in nature can be tracked very easily. With the introduction of Ajax, an attacker can exploit cross-site scripting issues on a much larger scale (Kassoff, 2005). Sound engineering principles require that specialized security testing be performed before moving the application into production to address these vulnerable issues. The standard mitigation, escaping untrusted data before it reaches the page, is sketched below.
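The sketch below shows the vulnerability pattern and the commonly practised mitigation. The comment string is a hypothetical example of untrusted input delivered over Ajax; it is not taken from the paper.

```javascript
// Untrusted data, e.g. a user comment fetched over Ajax (hypothetical).
var comment = '<img src=x onerror="alert(document.cookie)">';

// VULNERABLE: innerHTML parses the string as markup, so the
// attacker's onerror handler would run in the victim's session:
//   document.getElementById("out").innerHTML = comment;

// SAFE: treat untrusted data as text, never as markup.
document.getElementById("out").textContent = comment;

// If HTML output is unavoidable, escape the metacharacters first.
function escapeHtml(s) {
  return s.replace(/&/g, "&amp;")
          .replace(/</g, "&lt;")
          .replace(/>/g, "&gt;")
          .replace(/"/g, "&quot;")
          .replace(/'/g, "&#39;");
}
document.getElementById("out").innerHTML = escapeHtml(comment);
```

Server-side validation must back this up since, as noted above, anything enforced only in the client can be modified by the attacker.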
Conclusion

The security aspects and hacking techniques discussed here expose users to service-oriented architectures (Andrews, 2006). The development of service-based endpoints introduces more vulnerabilities, because Ajax opens many possibilities for pushing an application away from the standard three-tier model. The usual nature of services over the web was focused on e-commerce activity, typically B2B, and designers and developers accordingly overlooked the phenomenon of interacting with end users at large (Fielding, 2005). The different thinking posed by Web 2.0 led developers to see things from a variety of angles and finally to conclude that security was quite weak and had to be strengthened. Old habits led to some bad security assumptions: where one once assumed that the middle tier would handle authentication and input validation, the coming of Ajax means that such logic, when implemented at the client side, is fully accessible to the user.

References/Bibliography

Anderson, P. (2007). JISC: What is Web 2.0? Ideas, technologies and implications for education.

Andrews, G. (1991). Paradigms for process interaction in distributed programs. ACM Computing Surveys, 23(1), 49-90.

Atlassian.com (2008). Crucible: efficient code review. Retrieved 20 April 2008 from http://www.atlassian.com/software/crucible/

CgiSecurity (2008). Zero Day Subscriptions: Using RSS and Atom Feeds as Attack Delivery Systems. Retrieved 20 April 2008 from www.cgisecurity.com/papers/RSS-Security.ppt

Cirillo, A., Jagadeesan, R., Pitcher, C. and Riely, J. (2007). Do As I SaY! Programmatic access control with explicit identities. In 20th IEEE Computer Security Foundations Symposium.

Clark, D. D. and Tennenhouse, D. L. (1990). Architectural considerations for a new generation of protocols. In Proceedings of the ACM SIGCOMM '90 Symposium (Philadelphia, PA, September 1990), 200-208.

Davis, F. et al. (1990). WAIS interface protocol prototype functional specification (v1.5). Thinking Machines Corp., April 1990.

Erickson, T., Halverson, C., Kellogg, W. A., Laff, M. and Wolf, T. (2002). Social translucence: designing social infrastructures that make collective activity visible. Communications of the ACM (Special Issue on Community, ed. J. Preece), 45(4), 40-44.

Fielding, R. T. (1995). Relative uniform resource locators. Internet RFC 1808, June 1995.

Kassoff, M., Zen, L.-M., Garg, A. and Genesereth, M. (2005). PrediCalc: a logical spreadsheet management system. In 31st International Conference on Very Large Data Bases (VLDB), 1247-1250.

Pat (2006). Securing Web 2.0: Why Security 1.0 is no longer enough.

Trieloff, L. (2008). Rapid Web 2.0 Hacking with Jackrabbit, Cocoon, Dojo. Retrieved 20 April 2008 from http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd

W3Schools.com (2008). Introduction to SOAP. Retrieved 20 April 2008 from http://www.w3schools.com/soap/soap_intro.asp