
Ahmedabad SEO Expert

Ahmedabad SEO Expert provides quality SEO services that can grow your business within a few months.

Location: Ahmedabad, Gujarat, India

I am Viral A. Kayasth. I specialize in SEO, promoting websites to increase unique visitors and grow companies. You can contact me at 9825900216 or 26636477.

Friday, October 13, 2006

The following are terms commonly used on the technical side of SEO.
SEO Glossary:

AdSense

Google's AdSense program displays relevant ads on your website; publishers earn money by joining through the Google AdSense page.

AdWords

Google's CPC (cost-per-click) based text advertising. AdWords takes click-through rate into consideration, in addition to the advertiser's bid, to determine an ad's relative position within the paid search results. This is a notable difference from Overture's CPC model, which is based purely on bid amount. Google applies such a weighting factor in order to feature those paid search results that are more popular and thus presumably more relevant and useful.

Anchor text
Anchor text is the visible, clickable text of a link. Search engines rely on it heavily when judging what the linked page is about.

Alt tag
The alt attribute supplies a short text description of an image, used when the image cannot be displayed and read by search engines and screen readers.

Algorithm
A complex mathematical formula used by a search engine to rank the web pages it finds by crawling the web.

Banned
When a search engine blocks your domain, IP, or URL, the site is said to be banned.

Blog
A blog (short for weblog) is a frequently updated log-style web page that maintains a running record of a site's content.

Blacklist
Lists that search engines or vigilante users compile of search engine spammers, which may be used to ban those spammers from search engines or to boycott them.

Bridge page
Also known as a doorway or jump page: a page created to funnel traffic to your site from a particular search engine.

Broken link
A link that points to a page that has been deleted, renamed, or never existed; clicking it produces an error.
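A broken link can be detected automatically by requesting the URL and looking at the HTTP status. This is a minimal sketch (the function names are my own, not from the glossary); error statuses of 400 and above, such as 404 Not Found, mark a link as broken.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def is_broken_status(status: int) -> bool:
    """HTTP status codes of 400 and above indicate a broken link."""
    return status >= 400

def check_link(url: str) -> bool:
    """Return True when the link appears broken (error status or no response)."""
    try:
        with urlopen(url, timeout=10) as resp:
            return is_broken_status(resp.status)
    except (HTTPError, URLError, OSError):
        return True

print(is_broken_status(404))  # True: a deleted page is broken
print(is_broken_status(200))  # False: a healthy page is fine
```

Running such a checker over every link on a site is a quick way to find the broken ones before a crawler does.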

Back link
Backlinks are incoming links to a website or web page. The number of backlinks is an indication of the popularity or importance of that website or page. In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node

Cost per click (CPC)
The cost incurred, or price paid, for a click-through to your landing page.

Click-through rate (CTR)
The number of times a link is clicked divided by the number of times it is displayed (its impressions).
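The calculation is simple enough to sketch directly (the function name is my own):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR = clicks / impressions, expressed here as a percentage."""
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

# 30 clicks out of 1,000 times the link was shown -> 3.0 percent
print(click_through_rate(30, 1000))
```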

Crawler
A crawler, also known as a spider, is a program run by a search engine that reads a page's links, meta tags, and content and ranks the page according to the algorithm it follows.
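A toy illustration of the parsing step a crawler performs on a fetched page: extracting the links it will follow next. Real spiders add URL queues, politeness rules, and ranking; this sketch (class name my own) shows only link extraction with the standard library's HTML parser.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag seen while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<html><body><a href="/about.html">About</a> <a href="http://example.com/">Home</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about.html', 'http://example.com/']
```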

Cloaking
Serving different content to search engines than to human visitors is known as cloaking. The major search engines, including Google, penalize this practice; cloaking is considered a form of spamming.

Content
"Content is king": the best way to rank #1 in the major search engines is to add relevant text content to your site that uses your main keywords.

Doorway pages
Also known as bridge or jump pages: pages created to funnel traffic to your site from a particular search engine.

Directory
Human editors group websites into categories and provide site descriptions, or edit the descriptions submitted to them. With a directory, picking the right category and composing a description rich in key phrases ensures maximum visibility. Contrast this with a search engine, which is unedited and concerned primarily with the HTML of a site's constituent pages.

A directory offers a set of categories into which you can submit your site; once an editor approves the submission, your site's link, title, and description appear on the web.

Domain name
The web address of your site is known as its domain name.

External link
A link on your pages that points to a relevant site other than your own.

Flash
Flash is animation software used to present your business with animation, effects, sound, and so on.

Frameset
A web page that is made up of frames. A useful analogy: if the individual frames that make up the frameset are the 'children,' then the frameset is the 'parent.'

Googlebot
Googlebot is the name of Google's search engine crawler (spider). It fetches your web pages, which Google then ranks; it deep-crawls web sites roughly monthly.

Google Toolbar
Google's toolbar adds features to the web browser such as:
displaying the PageRank of each page opened in the browser.
A popup blocker.
Search-term highlighting.
A spell checker.

Search Engine Optimization Link Popularity Services

Link popularity is a measure of the quantity and quality of other web sites that link to a specific site on the World Wide Web. It is an example of the move by search engines towards off-the-page-criteria to determine quality content. In theory, off-the-page-criteria adds the aspect of impartiality to search engine rankings.

Link popularity plays an important role in a web site's visibility among the top search results. Indeed, some search engines require at least one link pointing to a web site; otherwise they will drop it from their index.

Search engines such as Google use a special link analysis system to rank web pages. Citations from other WWW authors help to define a site's reputation. The philosophy of link popularity is that important sites will attract many links. Content-poor sites will have difficulty attracting any links. Link popularity assumes that not all incoming links are equal, as an inbound link from a major directory carries more weight than an inbound link from an obscure personal home page. In other words, the quality of incoming links counts more than sheer numbers of them.
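The "link analysis system" referred to above is, in Google's case, the published PageRank algorithm. As a rough sketch (an illustration of the published algorithm, not Google's actual implementation), each page's score is repeatedly redistributed along its outgoing links until the scores settle:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict of page -> list of outgoing links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum the rank flowing in from every page that links to p.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new
    return rank

# A tiny three-page web: A links to B and C; B links to C; C links back to A.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # C: it receives links from both A and B
```

Note how C, with two inbound links, outranks B, with one: exactly the "links as votes" idea the paragraph describes.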

To search for pages linking to a specific page, enter the URL on Google or Yahoo! with the link: operator, for example: link:www.example.com

Here are some strategies that are generally considered to be important to increase link popularity:

There should be links from the home page to all subpages so that a search engine can transfer some link popularity to the subpages.

Appropriate anchor text with relevant keywords should be used in the text links that are pointing to pages within a site (technically, this helps link context, not link popularity).

Getting links from other web sites, particularly sites with high PageRank, can be one of the most powerful site promotion tools. Therefore, the webmaster should try to get links from other important sites offering information or products compatible or synergistic to his/her own site or from sites that cater to the same audience the webmaster does. The webmaster should explain the advantages to the potential link partner and the advantages his/her site has to their visitors.
One way links often count for more than reciprocal links.
The webmaster should list his/her site in one or more of the major directories such as Yahoo! or the Open Directory Project.

The webmaster should only link to sites that he/she can trust, i.e. sites that do not use "spammy techniques".
The webmaster should not participate in link exchange programs or link farms, as search engines will ban sites that participate in such programs.

To increase link popularity, many webmasters interlink multiple domains that they own, but often search engines will filter out these links, as such links are not independent votes for a page and are only created to trick the search engines. See Spamdexing. In this context, closed circles are often used, but these should be avoided, as they hoard PageRank.

This content is adapted from Wikipedia.

Search Engine Optimization Ahmedabad Gujarat: Improve Your Web Rank and Traffic

Search engine optimization (SEO) is a set of methods aimed at improving the ranking of a website in search engine listings, and could be considered a subset of search engine marketing. The term SEO (Search Engine Optimizers) also refers to an industry of consultants who carry out optimization projects on behalf of clients' sites. Some commentators, and even some SEOs, break down methods used by practitioners into categories such as "white hat SEO" (methods generally approved by search engines, such as building content and improving site quality), or "black hat SEO" (tricks such as cloaking and spamdexing). White hatters charge that black hat methods are an attempt to manipulate search rankings unfairly. Black hatters counter that all SEO is an attempt to manipulate rankings, and that the particular methods one uses to rank well are irrelevant.

Search engines display different kinds of listings in the search engine results pages (SERPs), including: pay per click advertisements, paid inclusion listings, and organic search results. SEO is primarily concerned with advancing the goals of a website by improving the number and position of its organic search results for a wide variety of relevant keywords. SEO strategies may increase both the number and quality of visitors. Search engine optimization is sometimes offered as a stand-alone service, or as a part of a larger marketing effort, and can often be very effective when incorporated into the initial development and design of a site.

For competitive, high-volume search terms, the cost of pay per click advertising can be substantial. Ranking well in the organic search results can provide the same targeted traffic at a potentially significant savings. Site owners may choose to optimize their sites for organic search, if the cost of optimization is less than the cost of advertising.

Not all sites have identical goals for search optimization. Some sites seek any and all traffic, and may be optimized to rank highly for common search phrases. A broad search optimization strategy can work for a site that has broad interest, such as a periodical, a directory, or site that displays advertising with a CPM revenue model. In contrast, many businesses try to optimize their sites for large numbers of highly specific keywords that indicate readiness to buy. Overly broad search optimization can hinder marketing strategy by generating a large volume of low-quality inquiries that cost money to handle, yet result in little business. Focusing on desirable traffic generates better quality sales leads, resulting in more sales. Search engine optimization can be very effective when used as part of a smart niche marketing strategy.

* This content is taken from wikipedia

Free SEO-Friendly Directory and Engine List

Here I present a valuable resource for everyone: a list of free, SEO-friendly directories and engines.

Tuesday, October 10, 2006

Facts of the week: Google and latent semantic analysis

Rumor in search engine forums has it that Google is giving more weight to latent semantic indexing (LSI) in its latest ranking algorithm update.

What is latent semantic indexing?

LSI means that a search engine tries to associate certain terms with concepts when indexing web pages. For example, Paris and Hilton are associated with a woman instead of a city and a hotel, Tiger and Woods are associated with golf.

Google has been using this concept to determine suitable ads for its AdSense service for some time now. It seems that Google is now also using this concept to improve the quality of its search results.

If you search for a keyword on Google and add a tilde ~ before the search term, then you get an idea of what Google thinks about a search term.

For example, if you do a semantic search for phone, Google returns Nokia as the first result. A normal search for phone returns different results. Adding a tilde to the search term seems to instruct Google to do a semantic search.

Why is this important to your SEO activities?
If Google uses this concept in its ranking algorithm (which is very likely), then it's advisable to focus your search engine optimization activities not on a single keyword, but on a set of related keywords.

You should optimize some of the pages on your web site for keywords that are synonymous with the keyword you're targeting.

When you exchange links with other web sites, make sure that you vary the link text to your web site so that it contains different variations and synonyms of your keyword.
Where can I find further information about LSI?

Latent semantic indexing helps search engines to find out what a web page is all about. It basically means to you that you shouldn't focus on a single keyword when optimizing your web pages and when getting links.

The web pages on your web site should be related and focus mainly on a special topic while using different words that describe the topic. Use variations of your keyword and synonyms. That makes it easier for search engines to determine the topic of your site.
Source : Free:SEO:News

Wednesday, October 04, 2006

How To Beat Google Brandy Update

Alex Walker :
The Brandy update seems to have incorporated some pre-Florida results (another major update that occurred at the end of 2003), mixed with numerous new factors. Google stores its index on a number of data centers around the world. Since 'Florida', some of the old data centers were taken offline, and pundits believe that Google has kept the old SERPs (Search Engine Results Pages) in a preserved state for the last few months.

Indeed, Google brought these data centres back at the same time that Yahoo! broke from Google, in favour of its new Inktomi-based results. Consequently, I don't think this is the last of the major changes we'll see in Google, but it does seem that Google is getting closer to what it aims to achieve.

Five Changes

Brin, one of the founders of Google, recently said that Google has made five significant changes to its algorithmic formulas in the last two weeks (Associated Press (AP), Feb 17th 2004). While we can only guess at what those changes were, the following are probably a good bet.

Increase in Index Size :

Google's spider, Googlebot, has had a busy few weeks -- at the time of the update, Google announced that it had massively increased the size of its index.
This move was probably made to ensure Google made headlines at the same time as Yahoo! (for example, in this report in the BBC News, Feb 18th 2004). However, in order to increase the index size, Google may have had to re-include some of the pre-Florida results that had previously been dropped.

Latent Semantic Indexing (LSI)

This is a very significant new technology that Google has always been interested in, and the incorporation of LSI has been on the cards for some time. If you are an insomniac, then Yu et al.'s paper is quite helpful in explaining the concept, but, in short, LSI is about using close semantic matches to put your page into the correct topical context.
It's all about synonyms. LSI may see Google effectively remove all instances of the search keyword when analysing your page, in favour of a close analysis of other words. For example, consider the search term 'travel insurance'. LSI-based algorithms will look for words and links that pertain to related topics, such as skiing, holidays, medical, backpacking, and airports.
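Real LSI works by applying singular value decomposition to a term-document matrix. As a heavily simplified sketch of the underlying intuition only (the documents and function names here are invented for illustration), words that occur in similar documents end up with similar vectors, so "travel" sits closer to "insurance" than to an unrelated term:

```python
from math import sqrt

docs = [
    "travel insurance for skiing holidays",
    "medical travel insurance for backpacking",
    "airports and holidays travel guide",
    "golf clubs and golf courses",
]

def term_vector(word):
    # One component per document: how often the word appears in it.
    return [d.split().count(word) for d in docs]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

related = cosine(term_vector("travel"), term_vector("insurance"))
unrelated = cosine(term_vector("travel"), term_vector("golf"))
print(related > unrelated)  # True: "travel" is semantically nearer "insurance"
```

An LSI-style engine goes further and factorizes the matrix so that even words that never co-occur directly can be associated through shared context, but the distance comparison above is the core idea.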

Links and Anchor Text

Links have always been the essence of Google, but the engine is steadily altering its focus. The importance of Page Rank (PR), Google's unique ranking system, is being steadily downgraded in favour of the nature, quality, and quantity of inbound and outbound link anchor text. If PR is downgraded, and the wording of inbound links is boosted, this may explain, to a large degree, the position in which many sites currently find themselves.
For example, most people will link to a site's homepage. In the past, due to internal linking structures, PR was spread and other pages benefited. Now, it is more important for Webmasters to attract links that point directly to the relevant pages of their sites using anchor text that's relevant to the specific pages.
Furthermore, Google seems to be using outbound links to determine how useful and authoritative a site is. For example, the directories that are doing well are those that link directly to sites, rather than using dynamic URLs.


Now, more than ever, has the question of who's linking to your site become critical. Links must be from related topic sites (the higher the PR the better); those links are seen to define your 'neighbourhood'. If we again consider the example of travel insurance, big insurance companies might buy links on holiday-related sites in order to boost their ranking. These businesses will actively invest in gaining targeted inbound links from a broad mix of sites. Consequently, their neighbourhoods appear tightly focused to Google.

Downgrading of Traditional Tag-Based Optimisation

Clever use of the title, h1, h2, bold, and italics tags, and CSS, is no longer as important to a site's ranking as it once was. It is very interesting to listen to Sergey (co-founder of Google) talk about this, because he's the one usually quoted about the ways in which people manipulate his index. Google has taken big steps to downgrade standard SEO techniques in favour of LSI and linking, which are far less manipulable by the masses.

The Impact of Brandy

These changes make for sober reading if you're a Webmaster -- to optimize your site successfully for Google has become a lot more difficult. Nevertheless, there are a number of practical steps that can be taken to promote your ranking in the short and long term.


As LSI appears to be so significant, it is important to start looking carefully at the information architecture of each major section of your site, and to increase the use of related words. It is also important to re-examine the title tags to include this concept; good title tags have synonyms and avoid repetition of the key phrase.

Outbound Links

Link to authority sites on your subject. In the travel insurance example, these authority sites could include places like the State department, major skiing directories, etc. Not only will this help with LSI, it also allows Google to define the neighbourhood more easily. Furthermore, you could engage in link swaps with other companies so that you gain the benefit of an on-topic, LSI-friendly link.
Inbound Links and Link to Us Pages

Based on what we have just said, sites need to formulate a link development strategy. A budget needs to be set aside to buy links and develop mini-sites. Look to set up links with university sites (.edu), as these seem to be valuable given Google's informational bias.
Each section of a site should have its own link-to-us page. For example, HotScripts, the major computer script directory, has a great link-to-us page.

By providing people with creatives and cut-and-paste HTML, you can vastly improve your chances of attracting reciprocal links to your site. You'll need to have a separate page for each section, to maximise on-topic inbound links.


It is important to develop separate mini-sites (also known as satellite sites) for each key subject of your Website. This is a useful tactic that improves your chances of appearing in the SERPs for your keywords. Furthermore, as the last three Google updates have shaken things up so much, having more than one site reduces the likelihood that your business will be disrupted by the engine's updates. However, Google is likely to view satellite sites as spam, so you must take some steps to reduce the chances of your being blacklisted on this basis.
First, make it as hard as possible for Google to detect host affiliation between your main site and its mini-sites. Google may define sites as owned by the same person if the first 3 octets of the sites' IP addresses are the same. Therefore, if you're going to run mini-sites, put them on different Web hosts.
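The "first 3 octets" test above is easy to check for yourself. A small sketch (function name my own; the addresses are documentation examples): two IPv4 addresses in the same /24 block share their first three octets, which the article suggests may be read as common ownership.

```python
from ipaddress import ip_address

def same_first_three_octets(ip_a: str, ip_b: str) -> bool:
    """True when both IPv4 addresses fall in the same /24 block."""
    a, b = ip_address(ip_a), ip_address(ip_b)
    return a.packed[:3] == b.packed[:3]

print(same_first_three_octets("203.0.113.10", "203.0.113.99"))  # True: same /24
print(same_first_three_octets("203.0.113.10", "198.51.100.7"))  # False: different hosts
```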
Secondly, use different domain names for your mini-sites, rather than sub-domains of your main site. In the past, Google has not penalised sub-domains, but the early results from the Brandy update show a considerable reduction in the presence of sub-domains in the SERPs.
Finally, be very careful with the linking strategy you use between mini-sites -- Google will look at the linking structure very critically. Don't plaster each of your sites with links to the others, and don't reciprocate links between the sites.
Mini-sites make it easier to create on-topic neighbourhoods and experiment with LSI techniques. Creating a large network can be a means to boost your main site's rank, but make sure you're well aware of the risks involved with creating these mini-sites before you embark.

Thursday, September 28, 2006

How do I add my site to Google's search results?

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses software known as "spiders" to crawl the web on a regular basis and find sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when our spiders crawl the web.

To determine whether your site is currently included in Google's index, just perform a search for your site's URL. If your site doesn't appear in the results, it may be because:

The design of the site makes it difficult for Google to effectively crawl its content.

The site was temporarily unavailable when we tried to crawl it, or we received an error when we tried to crawl it. You can use Google webmaster tools to see if we received errors when trying to crawl your site.
Our intent is to represent the content of the internet fairly and accurately. To help make this goal a reality, we offer guidelines as well as tips for building a crawler-friendly site. While there's no guarantee that our spiders will find a particular site, following these guidelines should increase your site's chances of showing up in our search results.

Consider creating and submitting a detailed site map of your pages using Google Sitemaps. Google Sitemaps is an easy way for you to submit all your URLs to the Google index and get detailed reports about the visibility of your pages on Google. With Google Sitemaps, you can automatically keep us informed of all of your current pages and of any updates you make to those pages. Please note that submitting a Sitemap doesn't guarantee that all pages of your site will be crawled or included in our search results.
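A Sitemap file of the kind described above is just XML in the sitemaps.org format. This is a minimal generation sketch (the URLs are placeholders; the namespace is the one defined by the sitemaps.org 0.9 protocol):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-format XML string listing the given page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/", "http://www.example.com/about.html"])
print(xml)
```

The resulting file is uploaded to the site and its location submitted through Google Sitemaps.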

How can I create a Google-friendly site?

Give visitors the information they're looking for

Provide high-quality content on your pages, especially your homepage. This is the single most important thing to do. If your pages contain useful information, their content will attract many visitors and entice webmasters to link to your site. In creating a helpful, information-rich site, write pages that clearly and accurately describe your topic. Think about the words users would type to find your pages and include those words on your site.

Make sure that other sites link to yours

Links help our crawlers find your site and can give your site greater visibility in our search results. When returning results for a search, Google combines PageRank (our measure of a page's importance) with sophisticated text-matching techniques to display pages that are both important and relevant to each search. Google counts the number of votes a page receives to determine its PageRank, interpreting a link from page A to page B as a vote by page A for page B.
Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important." Please note that ranking of sites in our search results is completely automated, and we don't manually assign keywords to sites.

Keep in mind that our algorithms can distinguish natural links from unnatural links. Natural links to your site develop as part of the dynamic nature of the web when other sites find your content valuable and think it would be helpful for their visitors. Unnatural links to your site are placed there specifically to make your site look more popular to search engines. Some of these types of links (such as link schemes and doorway pages) are covered in our webmaster guidelines.

Only natural links are useful for the indexing and ranking of your site.
Make your site easily accessible

Build your site with a logical link structure. Every page should be reachable from at least one static text link.

Use a text browser, such as Lynx, to examine your site. Most spiders see your site much as Lynx would. If features such as JavaScript, cookies, session IDs, frames, DHTML, or Macromedia Flash keep you from seeing your entire site in a text browser, then spiders may have trouble crawling it.

Consider creating static copies of dynamic pages. Although the Google index includes dynamic pages, they comprise a small portion of our index. If you suspect that your dynamically generated pages (such as URLs containing question marks) are causing problems for our crawler, you might create static copies of these pages. If you create static copies, don't forget to add your dynamic pages to your robots.txt file to prevent us from treating them as duplicates.
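The robots.txt advice above can be sketched and verified with the standard library (the paths here are hypothetical; `urllib.robotparser` matches rules by simple path prefix): block the dynamic versions of pages so the crawler indexes only the static copies.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: keep crawlers out of the dynamic URL space.
robots_txt = """\
User-agent: *
Disallow: /dynamic/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The dynamic page is blocked; its static copy remains crawlable.
print(rp.can_fetch("Googlebot", "http://www.example.com/dynamic/page.php?id=7"))  # False
print(rp.can_fetch("Googlebot", "http://www.example.com/static/page7.html"))      # True
```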

Source :

Tuesday, September 26, 2006

Are Search Engine Spiders Infringing on Copyright? Belgium Speaks Out

Last Friday (Sept 15th), a Belgian court dealt a stunning blow against Google and its Google News search service. The court is now forbidding the popular search engine from indexing Belgian newspaper content without paying each newspaper for the use of its content.

The ruling requires Google to remove the plaintiffs' newspaper content from its search engine database within 10 days or face threatened fines of €1,000,000 per day. In addition, Google must publish “in a visible and clear manner and without any commentary from her part the entire intervening judgment on the home pages of ‘’ and of ‘’ for a continuous period of 5 days within 10 days… under penalty of a daily fine of 500,000- € per day of delay.” (source website and original legal document)

This ruling signifies a strong precedent for other newspapers to follow and ultimately brings up a tempting legal option for all of those sue-happy “people” out there; “if my site is copyrighted… can I sue Google for indexing it?” The fact is newspapers with an online presence are bound to jump in on the action and try to get a little financial love from the major search engines and precedent like this just urges them on.

Danny Sullivan published an extremely informative article today where he describes his interview with the Belgian group that led this successful case against Google. In this article he notes that Google CEO Eric Schmidt cut to the chase and sensibly summarized this legal nightmare as “business negotiation being done in a courtroom.” I must agree because there is no question the path news companies are taking to get their needs met is ass-backwards.

Just consider the following:

Google drove traffic to Belgian news sites ultimately making these sites money; their advertisers got more visibility and they got more subscribers.

Belgian news sites took Google to court to have their copyrighted material removed because they felt Google should not be able to use it without paying for it.

By winning their case Belgian news sites have now been removed from Google and ultimately the news sites will lose money by their online exposure being severely decreased.

To put this in another perspective if this was your business; would you spend money to go to court with the ultimate goal of losing money by losing online presence in the hopes that Google will pay you to get your content back later? Seems like a silly gamble to me and based entirely on the pursuit of ego.

The simple fact is that newspapers make money by getting free exposure from search giants like Google. Now they are biting the hand that feeds them. In that vein, the World Association of Newspapers (WAN) is taking a different, more sensible step: locating a technological solution to the problem without going immediately to court. Based in Paris, WAN represents 18,000 newspapers from around the world. According to WAN, two options have been considered at this early juncture:

Pay Royalties: search engines that pay an essential royalty will then be allowed to present copyrighted content within their search engine results for a limited time. If they don’t pay then the content will be blocked and this must be respected by the search engines as a whole.
New Robot Limiting Capabilities: the search engines need to acknowledge and follow an instructive file that could be provided by news sites which would define limits for the use of copyrighted information on their site.

How serious is WAN? Apparently a number of WAN members have separately launched legal proceedings against Google over the “Napsterisation” (illegal use) of stories on its website.
Here is a January 31st article discussing this issue: Newspaper, Magazine and Book Publisher Organizations to Address Search Engine Practices.

Oil holds above $61, spotlight on OPEC

By Randy Fabi : LONDON (Reuters) - Oil held above $61 on Tuesday after rebounding from a six-month low as dealers focused on whether OPEC might trim output should prices fall further.

U.S. crude eased 31 cents to $61.14 a barrel by 1240 GMT, while London Brent slipped 547 cents to $60.26 a barrel.

Oil prices rebounded on Monday, settling up 90 cents, after the market briefly slipped below the psychologically important $60 level earlier in the session.

"The general sentiment is that if prices fall further, OPEC might decide to come in and cut its quota," said Andrew Harrington, an industry analyst at ANZ.

"Given the language that some OPEC members are using, the market is interpreting that a price below $60 would be a trigger point for the group to act," he added.

The Organization of the Petroleum Exporting Countries, which pumps more than a third of the world's oil, is concerned about a drop in oil prices but has no plans to call an emergency meeting ahead of its scheduled December 14 meeting in Nigeria, OPEC sources said on Monday.
U.S. crude has tumbled around 20 percent since its $78.40-peak in mid-July, taking back virtually all of this year's gains.

OPEC has avoided setting a target oil price to defend. Saudi Oil Minister Ali al-Naimi, who steers the policy of the world's biggest exporter, said last week that prices were "reasonable."

Wednesday, September 20, 2006

Quality SEO Directory at Your Doorstep

Acquire Quality directory List

Sunday, August 13, 2006

Ahmedabad SEO Expert, Ahmedabad SEO Services

Get the best SEO services at an affordable price from Viral Kayasth (Ahmedabad SEO Expert, Ahmedabad SEO Services).

What is Search Engine Optimization ( SEO ) ?

Nowadays, millions of people use the internet, continually searching for new techniques to promote their business products and achieve quality growth. Companies from the largest corporates down to small- and mid-cap firms want SEO for their sites to drive heavy traffic to their business. SEO is the process of increasing the number of visitors to a web site by ranking high in the results of a search engine. The higher a web site ranks in the results of a search, the greater the chance that the site will be visited by a user. Ultimately, more visitors mean more valuable inquiries and, finally, higher sales revenue.

Why Search Engine Optimization ( SEO ) Required For Your Company ?

Today, in this globally competitive era, busy schedules lead many people to search for products online and to place orders and conduct business dealings online. Your site has a realistic chance of generating inquiries only when it appears on the 1st, 2nd, or 3rd page of the Google, MSN, or Yahoo search results. It is a globally accepted fact that most visitors come to your business through your website, and that only happens if your site ranks well. Thus SEO is crucial to ranking your site well and increasing your business's growth.

How I can help you ? ( Ahmedabad SEO Expert, Ahmedabad SEO Services )

I have more than a year of solid experience in the SEO field. I have successfully worked on UK client sites as well as large company sites that have ranked well for the last couple of months, with their unique visitors increasing day by day. I use the latest tools and techniques to earn your site a strong position in major search engines such as Google, Yahoo, and MSN.

Successful SEO Sites & Their Current Results
Some proven results from sites I have worked on:

Keyword: Data Entry

--> Google: 3rd rank
--> Yahoo: 5th rank

Keyword: Data Processing

--> Google: 5th rank
--> MSN: 3rd rank

Keyword: IT Outsourcing

--> Google: 4th rank
--> Yahoo: 6th rank
--> MSN: 15th rank

Get Cheap SEO Services in Ahmedabad, Gujarat.

Contact me (Viral A. Kayasth) at 26636477 or 9825900216 for affordable SEO services and see the difference in your business growth. You can also email me at your convenience. (Ahmedabad SEO Expert, Ahmedabad SEO Services)