Tuesday, June 4, 2019

Website Quality Evaluation Based on Sitemap

M. Chandran, A. V. Ramani

Abstract: Website quality evaluation can be made by creating a site map of the web pages of a single website that work properly. A website is taken for analysis, and every link under that website is checked and split according to status code. By analyzing the status code of every web page link, we ignore every link except the pages that return status code 200, and we then develop the sitemap for the links which work perfectly.

Keywords: Sitemap, Website, Search Engine Optimization, SMGA.

1. Introduction

Websites are something entirely new in the world of software quality, going live within minutes. The World Wide Web has made the spread of information and ideas easy and accessible to millions. It's the place where everyone has an opportunity to be heard; that is, if you can be found amidst the multitude of other Web sites out there. Every web page has its own characteristics, and these characteristics have drawbacks and benefits [1].

There are many dimensions of quality, and each pertains to a particular website in varying degrees. Here are some of them. First is time: a credible site should be updated frequently, and the date of the most recent update should be shown on the home page. If the information has not been updated recently, the visitor could easily conclude that the site manager does not really bother to update the site. Second is structure: all parts of the website should hold together, and all links inside and outside the website should work well. Broken links on a webpage are another factor that always downgrades the quality of a website. Each page usually has references or links to other pages; these may be internal or external to the site.
A user expects each link to be valid, meaning that it leads successfully to the intended page or other resource. An experiment in 2003 discovered that about one link out of every 200 disappeared from the Internet each week [1]. The third factor is content: the number of links, or link popularity, is one of the off-page factors that search engines examine to determine the value of a webpage. Most search engines require a website to have at least two links pointing to it before they will place it in their index, and the idea behind link popularity is that to increase the link popularity of a website, the website must have a large amount of high-quality content. The number of links to a website improves access growth and helps to generate traffic [2]. Link popularity is quantified by the PageRank formula:

PR(A) = (1 - d) + d (PR(t1)/C(t1) + ... + PR(tn)/C(tn))

where PR is the page rank, t1 ... tn are the pages linking to page A, C is the number of outbound links a page has, and d is a damping factor, usually set to 0.85.

Search engines such as Google perform citation analysis to rank hits, so a website with many links to it will have a higher ranking than a website with few links. This indicator can be used to measure the quality of a website. Fourth is response time and latency: a website server should respond to a browser request within certain parameters. It has been found that extraneous content exists on the majority of popular pages, and that blocking this content yields a 25-30% reduction in objects downloaded and bytes, with a 33% decrease in page latency. Popular sites averaged 52 objects per page, 8.1 of which were ads, served from 5.7 servers [3], and object overhead now dominates the latency of most web pages [4].
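As a minimal illustration (not part of the original study), the PageRank formula above can be computed iteratively; the three-page example graph below is made up:

```python
# Iterative sketch of the formula in the text:
#   PR(A) = (1 - d) + d * (PR(t1)/C(t1) + ... + PR(tn)/C(tn))
# `links` maps each page to the list of pages it links TO, so
# C(t) = len(links[t]) is the number of outbound links of page t.
def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page at rank 1
    for _ in range(iterations):
        new = {}
        for page in pages:
            # pages t linking to `page` each contribute PR(t)/C(t)
            incoming = sum(pr[t] / len(links[t])
                           for t in pages if page in links[t])
            new[page] = (1 - d) + d * incoming
        pr = new
    return pr
```

For a three-page cycle A -> B -> C -> A, every page converges to rank 1.0, which is the fixed point of the formula (x = 0.15 + 0.85x).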
Unless you're dealing with an enormous site which might be hard to maintain, in theory there shouldn't be errors in sitemaps if you have the correct URLs in there. Getting sitemaps right on a large site made a huge difference to the crawl rate, with a huge increase in indexation to follow [3]. With the growth of website content it is getting harder and harder to manage relations between individual web pages and keep track of hyperlinks within a site. Unfortunately there are no perfect website integrity tools or services that can enforce proper relationships between pages, keep track of moving content, webpage renames etc., and update corresponding URLs automatically. Modern content management systems and blog software may aggravate the problem even more by replicating the same dead web links across the numerous web pages which they generate dynamically, so people can be getting 404 errors much more frequently [4].

Sitemap

Sitemaps, as the name implies, are simply a map of your site: on one single page you show the structure of your site, its sections, the links between them, and so on. Sitemaps make navigating your site easier, and having an updated sitemap on your site is good both for your users and for search engines.

3. Important sitemap errors that could affect our rankings

The first step would be to be sure your sitemap is up to date to begin with and has all the URLs you want (and not any you don't want). The main thing is that none of them should 404, and beyond that, yes, they should return 200s. Unless you're dealing with a gigantic site which might be hard to maintain, in theory there shouldn't be errors in sitemaps if you have the correct URLs in there. Getting sitemaps right on a large site made a huge difference to the crawl rate and a huge increase in indexation to follow.

2. Problem Definition

Every web page has its own characteristics, and these characteristics have drawbacks and benefits.
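A minimal sketch of that sitemap audit, with the network access factored out so the filtering rule ("no 404s, everything returns 200") can be seen in isolation; `fetch_status` is an assumed caller-supplied function (e.g. a wrapper around urllib or requests) mapping a URL to its HTTP status code:

```python
# Audit a sitemap's URL list: every URL should return 200, none 404.
def audit_sitemap(urls, fetch_status):
    """Return (ok, broken): URLs returning 200 vs anything else."""
    ok, broken = [], []
    for url in urls:
        status = fetch_status(url)
        (ok if status == 200 else broken).append((url, status))
    return ok, broken
```

In a real run `fetch_status` would issue HTTP HEAD/GET requests; in a test it can simply be a dictionary lookup.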
There is a mechanism for measuring the effect of webpage components on the performance and quality of a website. This mechanism measures the size, components, and time required by the client to download a website. The main factors that influence this download time are the page size (bytes), the number and types of components, and the number of servers serving the accessed web pages. Research conducted by IBM can be used as a standard for this performance measurement of quality [7]. The international standard download time can be used as a reference to categorize the tested webpage. After the data have been collected, we continue with testing of the data.

Table 1. Standard of website performance

Four Reasons to Keep a Site Map

A site map is literally a map of your Web site. It is a tool that allows visitors to easily get around your site. Having a well-constructed site map is not only important to create a positive experience for your potential customers, but is also an important aspect of search engine optimization. Below are four functions of a site map.

Navigation

A site map provides an organised list of links to all the pages on your Web site. If visitors get lost while browsing your site, they can always refer to your site map to discover where they are and get where they would like to go. Site maps allow your visitors to navigate your Web site with ease.

Theme

When visitors access your site map, they will learn a lot about your Web site within a very short period of time. A well-constructed site map will allow visitors to easily and efficiently grasp your site.

Search Engine Optimization (SEO)

Since a site map is a single page that contains links to every page on your Web site, it is a very effective way to help search engine spiders crawl through your site with ease. Since search engines rely on links to discover the main pages of your site, a site map is a great way to get every page on your site indexed by the search engines.
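As an illustrative sketch only (the thresholds of Table 1 are not reproduced here, so the 10-second cutoff below is a hypothetical placeholder, not the IBM standard), download time can be estimated from page size and connection speed and then graded against a reference value:

```python
# Estimate and grade page download time. The 10 s threshold is a
# made-up placeholder standing in for the standard quoted in Table 1.
def download_time(page_bytes, bandwidth_bps):
    """Seconds needed to fetch `page_bytes` at `bandwidth_bps` bits/s."""
    return page_bytes * 8 / bandwidth_bps

def grade(seconds, threshold=10.0):
    """Categorize a measured download time against the threshold."""
    return "acceptable" if seconds <= threshold else "slow"
```

For example, a 125 kB page over a 100 kbit/s link takes 10 seconds, which this placeholder threshold still grades as acceptable.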
The more pages you have indexed by the search engines, the more potential you will have to reach a greater number of prospective clients. The World Wide Web has made the spread of information and ideas easy and accessible to millions. It's the place where everyone has an opportunity to be heard; that is, if you can be found amidst the multitude of other Web sites out there. Search Engine Optimization (SEO) is the process of making your Web site accessible to people using search engines to find the services you provide. Search engines (such as Google, Yahoo, and Bing) operate by providing users with a list of relevant search results based on the keywords users enter. This allows people who don't know your Web site address to find your site through keyword searches [1]. Some basic features of Web sites that search engine spiders look for are: keyword usage, keyword placement, compelling content, HTML title tags, meta-descriptions and keyword tags, external and internal links, site updates, site map, Web design, and functionality. Effective keyword usage is not simply based on repeating a keyword or phrase over and over on your Web site.

Organization

A site map enables you to easily assess the structure of your site to see where your site is strong and where it is weak. Whenever you need to add new content or new sections to your Web site, you will be able to take the existing hierarchy into consideration by glancing at your site map [1].

Sitemap files have a limit of 50,000 URLs and 10 megabytes per sitemap. Sitemaps can be compressed using gzip, reducing bandwidth consumption. Multiple sitemap files are supported, with a Sitemap index file serving as an entry point. Sitemap index files may not list more than 50,000 sitemaps, must be no larger than 10 MiB (10,485,760 bytes), and can be compressed. You can have more than one Sitemap index file [2][3].
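A small sketch of how the 50,000-URL limit can be respected in practice by splitting a URL list across several sitemap files and generating a matching index (the file naming scheme and the example.org base URL are assumptions for illustration):

```python
# Split a URL list into sitemap files of at most 50,000 URLs each
# (the per-file limit quoted above) and emit a matching sitemap index.
MAX_URLS_PER_SITEMAP = 50_000

def chunk_urls(urls, limit=MAX_URLS_PER_SITEMAP):
    """Yield successive slices of `urls`, each within the limit."""
    for start in range(0, len(urls), limit):
        yield urls[start:start + limit]

def build_sitemap_index(urls, base="http://www.example.org/sitemap"):
    """Return (index_xml, [sitemap_xml, ...]) for the given URL list."""
    sitemaps = []
    for chunk in chunk_urls(urls):
        body = "".join(f"<url><loc>{u}</loc></url>" for u in chunk)
        sitemaps.append(f"<urlset>{body}</urlset>")
    index_body = "".join(
        f"<sitemap><loc>{base}{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1))
    return f"<sitemapindex>{index_body}</sitemapindex>", sitemaps
```

A real generator would also add the XML declaration and namespace and gzip each file; only the chunking logic is shown here.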
Methodologies

These research stages start with problem identification, followed by the research procedure and an explanation of the sample data.

Nature of invalid hyperlinks

With the growth of website content it is getting harder and harder to manage relations between individual web pages and keep track of hyperlinks within a site. Unfortunately there are no perfect website integrity tools or services that can enforce proper relationships between pages, keep track of moving content, webpage renames etc., and update corresponding URLs automatically. With time this causes some hyperlinks to become obsolete, stale, dangling, and simply dead, because they no longer lead to valid pages, and web users get 404 response codes or other unsuccessful HTTP responses each time they try to access the web pages. Modern content management systems and blog software may aggravate the problem even more by replicating the same dead web links across the numerous web pages which they generate dynamically, so people can be getting 404 errors much more frequently.

Importance of an online link checker

Due to the lack of adequate problem-detection tools (aka URL validators, web spiders, HTML crawlers, website health analyzers, etc.) it is really very hard to identify which local and outbound hyperlinks have become dead, and it is even harder to fix them, because in order to do so you need to know the precise position of the broken linking tag in the HTML code. Without that, you would need to scan through thousands of source lines to find the exact HREF (or other linking sub-tag) that causes the problem.

Sample Data

To obtain data for this research, we examined the Ramakrishna Mission portals. These were not randomly selected; rather than selecting any generic site, a careful selection process was undertaken [5]. At the beginning of the process we give the website link, and we can see the status of that website, i.e. whether it is present or not. Through this analysis we are able to get the status code of the website link.
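The "precise position" problem can be illustrated with Python's standard html.parser, which reports the line and column of each anchor tag as it scans the source (a sketch of the idea, not the tool used in this study):

```python
# Scan HTML source and record (line, column, href) for every <a href=...>
# tag, so a dead link can be located without reading the source by hand.
from html.parser import HTMLParser

class HrefLocator(HTMLParser):
    """Collect (line, column, href) for each anchor tag encountered."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    line, col = self.getpos()  # position of the tag start
                    self.hrefs.append((line, col, value))

def locate_hrefs(html):
    parser = HrefLocator()
    parser.feed(html)
    return parser.hrefs
```

Each URL found this way could then be checked for its HTTP status; any URL that fails comes with the exact line and column of its linking tag.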
As shown in the figure, the domain name, IP address, and server name are displayed along with the status code. If the website status code is 200 then the website link that we gave is completely OK. If the website link we gave is broken or deleted, then the 404 status code error is displayed.

Constructing a Tree Structure by Applying the Site Mapping Generation Algorithm (SMGA)

MAPGEN(Di)
  GenRoot(m1, ..., mn)      // Getting root nodes for menus
  For i = 0 to n
    For j = 0 to f
      sj = GetChild(mi, j)  // For getting the child node
    End For
    If si == NULL
      AddNode(mi, NULL)     // No child node for the root
    Else
      AddNode(si, mi)       // Adding the child to the root node
    End If
  End For
For all m, s in Domain

4. Result and Discussion

In Table-2, we give a website link (http://www.srkv.org) to test whether that link is present or not. After receiving the status 200 for that link, we examine whether it has a site map or not. If we find that the website does not have a site map, we move to the next step of the process.

Table-2

From Table-3, we explore how many links in total that website contains. With the help of that data we process every single link to receive its status code. We then split the links into a number of groups sorted by status code. We develop the site map for the links which have status code 200 and ignore the rest of the links from that website.

Table-3. Dynamic website (www.srkv.org): list of errors with status code.

Table-4 shows the common status codes that occur often, with description and comment. When a received link has status 200, we can confirm that the link of that website is working fine. When the received code is 404, the requested page or URL is not available, or its location is unknown to the server. When the received status code is 522, the requested web server is currently down or unavailable due to traffic.

Table-4

Figure-1 represents, in the form of a chart, the data collected in Table-2.
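The SMGA procedure can be rendered as a small Python sketch (the tuple-based link representation below is an assumption for illustration): title links become root nodes, sub-title links become children of their root, and only links with status 200 enter the tree at all.

```python
# Minimal rendering of the SMGA idea: group working links (status 200)
# into a root/child tree, ignoring every broken link.
def build_sitemap_tree(links):
    """links: list of (url, parent_url_or_None, status) tuples.
    Returns {root_url: [child_url, ...]} for links with status 200."""
    tree = {}
    for url, parent, status in links:
        if status != 200:
            continue                  # ignore broken links entirely
        if parent is None:
            tree.setdefault(url, [])  # a title link becomes a root node
        else:
            tree.setdefault(parent, []).append(url)  # child under root
    return tree
```

For example, a working home page with one working and one dead child yields a tree containing only the working child.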
From the chart we can understand that, among the status codes of the website link data, the 404 status code occurs more often than the others.

Figure-1

The first step in creating a site map is choosing a site whose web pages are to be analysed. For this we take the link www.srkv.org. After reading every page under that link, we obtain a table of contents consisting of a series of links with their status codes. We found that the website contains 193 links in total, and we categorized those pages according to the status code of each link: 84 links return status code 200, 104 links return status code 404, 4 links return status code 410, and 1 link returns status code 522. Ignoring all links with any other status code, we create the site map only for the links that return status code 200.

Development of the Sitemap Generator

The sitemap shown in Figure-2 was generated with the help of the above algorithm. The algorithm defines the process of producing the result by means of root nodes and child nodes: if a link is a title, it is added as a root node; otherwise the sub-title link is added as a child node.

Figure-2

5. Conclusion

We take the required website link and check all the links under it, keeping those with status code 200 and ignoring the remaining error links. The pages which work completely are taken for creating the sitemap based on the SMGA algorithm. Search engine optimization looks for a sitemap in every website for the ranking system in every search query. We develop the sitemap for websites which do not already have one; when the SEO system finds the sitemap in a website, the ranking increases.

References

[1] Frank McCown, M.N., and Johan Bollen, "The Availability and Persistence of Web References in D-Lib Magazine," in the 5th International Web Archiving Workshop and Digital Preservation (IWAW'05), 2005.
[2] Larry Page, R.M., Sergey Brin, and Terry Winograd, "The Anatomy of a Large-Scale Hypertextual Web Search Engine," Stanford.
[3] Krishnamurthy, B. and C. Wills, "Cat and Mouse: Content Delivery Tradeoffs in Web Access," in WWW 2006, Edinburgh, Scotland.
[4] Yuan, J., Chi, C.H., and Q. Sun, "A More Precise Model for Web Retrieval," in WWW 2005, Chiba, Japan, 2005.
[5] Team, I.W.S., "Design for Performance: Analysis of Download Times for Page Elements Suggests Ways to Optimize," 2001.
[6] Information on helping spiders crawl through your Web site, available at http://sonicseo.com/helping-spiders/, last accessed 18 September 2013.
[7] Information on sitemaps, http://en.wikipedia.org/wiki/Sitemaps#Sitemap_index, last modified 15 September 2013.
[8] Information on Free Broken Link Checker / Online URL Validator, http://brokenlinkcheck.com/, last accessed 18 September 2013.
[9] Handaru Jati and Dhanapal Durai Dominic, "Quality Evaluation of E-Government Website Using Web Diagnostic Tools: Asian Case," 2009 International Conference on Information Management and Engineering, IEEE, 2009.
