
Monday, 31 October 2011

Chat Bot

A chat robot is a computer program that simulates human conversation, or chat, through artificial intelligence. A "chatterbot" or "chat bot" is a bot whose purpose is simply to communicate with a client. From humble beginnings such as the early program "ELIZA," chat bots have grown in purpose and scope. Typically, a chat bot communicates with a real person, but applications are being developed in which two chat bots can communicate with each other. Chat bots are used in applications such as e-commerce customer service, call centers and Internet gaming. Chat bots used for these purposes are typically limited to conversations about a specialized purpose rather than the entire range of human communication.
Windows Live Messenger, also known as MSN, is an instant messaging program which allows you to add and communicate with other users. A chat bot is a computer program that can be used with Windows Live Messenger; it registers the messages you send and simulates a human response, allowing you to carry on a realistic, interesting conversation.
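
To illustrate the idea, here is a minimal Python sketch of the pattern-matching approach that ELIZA-style chat bots use. The rules and replies below are invented for the example; a real chat bot would use a far larger rule base or a learned model.

    import re

    # Invented rule table for this sketch: a regex pattern and a reply template.
    RULES = [
        (r"\bhello\b|\bhi\b", "Hello! What would you like to talk about?"),
        (r"\bi feel (.+)", "Why do you feel {0}?"),
        (r"\bbecause (.+)", "Is that the real reason?"),
    ]

    def reply(message):
        # Try each rule in order; the first pattern that matches wins.
        for pattern, template in RULES:
            match = re.search(pattern, message.lower())
            if match:
                return template.format(*match.groups())
        return "Tell me more."  # fallback when nothing matches

    print(reply("Hi there"))            # -> Hello! What would you like to talk about?
    print(reply("I feel tired today"))  # -> Why do you feel tired today?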

References
Amber Viescas, eHow Contributor, What Are Web Bots?, Retrieved July 25, 2011.
URL:

Nade Xro, eHow Contributor, How to Add Chat Bots on MSN, Retrieved May 25, 2011.
URL:

Friday, 28 October 2011

Web Bot Contest Answers Are Released!!



Across
1. GOOGLE--A search engine spider
3. WEBBOT--Retrieves information from websites
4. INTERNETBOT--A software application that does repetitive and automated tasks on the Internet
5. SPAMBOT--Sends spam emails and typically works by automated means
6. HOTBOT--Uses the Inktomi database

Down
1. SHOPBOT--A special type of bot that is used to check prices
2. SPIDER--Another name for a web crawler
3. GOOGLEBOT--Crawling, indexing and serving are the key processes in delivering search results


Our blog contest has now come to a close. There were 4 participants in our contest. Thanks to everyone who joined, and congratulations to the 3 participants who won! The winners and their prizes are:

1. Lee Horng wins the first prize: 4GB pendrive x1!

2. Boo Kuok Chai wins the second prize: External USB hub x1!

3. Gan Shi Jet wins the third prize: Optical mouse x1!

Continue to follow the latest Web Bot information on our blog, Twitter and Facebook page. Thanks!

Thursday, 27 October 2011

Fifth Poll Result Analysis


The title of last week's poll was “Do you guys know what Malware Bots are?” We have created a pie chart of the vote results.



The figure above shows that there were only 2 votes in total: 1 person voted “Yes” and the other voted “I don’t know what that is.”

In conclusion, according to the chart above, only 1 person voted that he knows about Malware Bots, which means that very few people know what Malware Bots are. Lastly, we will continue to improve our blog and share more interesting information with our readers.

Web Scraping & Techniques

Web scraping (also called Web harvesting or Web data extraction) is a computer software technique of extracting information from websites. Usually, such software programs simulate human exploration of the World Wide Web by either implementing low-level Hypertext Transfer Protocol (HTTP), or embedding certain full-fledged Web browsers, such as Internet Explorer or Mozilla Firefox. Web scraping is closely related to Web indexing, which indexes information on the Web using a bot and is a universal technique adopted by most search engines. In contrast, Web scraping focuses more on the transformation of unstructured data on the Web, typically in HTML format, into structured data that can be stored and analyzed in a central local database or spreadsheet. Web scraping is also related to Web automation, which simulates human Web browsing using computer software. Uses of Web scraping include online price comparison, weather data monitoring, website change detection, Web research, Web mashup and Web data integration.

Techniques for Web scraping
Web scraping is the process of automatically collecting Web information. Web scraping is a field with active developments sharing a common goal with the semantic web vision, an ambitious initiative that still requires breakthroughs in text processing, semantic understanding, artificial intelligence and human-computer interactions. Web scraping, instead, favors practical solutions based on existing technologies that are often entirely ad hoc. Therefore, there are different levels of automation that existing Web-scraping technologies can provide:

1.) Human copy-and-paste: Sometimes even the best Web-scraping technology cannot replace a human’s manual examination and copy-and-paste, and sometimes this may be the only workable solution when the websites for scraping explicitly set up barriers to prevent machine automation.

2.) Text grepping and regular expression matching: A simple yet powerful approach to extracting information from Web pages can be based on the UNIX grep command or the regular expression matching facilities of programming languages (for instance Perl or Python). A short sketch of this technique appears after this list.

3.) HTTP programming: Static and dynamic Web pages can be retrieved by posting HTTP requests to the remote Web server using socket programming (see the socket sketch after this list).

4.) DOM parsing: By embedding a full-fledged Web browser, such as the Internet Explorer or Mozilla browser control, programs can retrieve the dynamic content generated by client-side scripts. These browser controls also parse Web pages into a DOM tree, from which programs can retrieve parts of the pages (see the DOM sketch after this list).

5.) HTML parsers: Some semi-structured data query languages, such as XQuery and HTQL, can be used to parse HTML pages and to retrieve and transform Web content.

6.) Web-scraping software: There are many Web-scraping software tools available that can be used to customize Web-scraping solutions. Such software may attempt to automatically recognize the data structure of a page, provide a Web recording interface that removes the need to manually write Web-scraping code, offer scripting functions that can be used to extract and transform Web content, or provide database interfaces that can store the scraped data in local databases.

7.) Vertical aggregation platforms: Several companies have developed vertical-specific harvesting platforms. These platforms create and monitor a multitude of “bots” for specific verticals, with no man in the loop and no work related to a specific target site. The preparation involves establishing a knowledge base for the entire vertical, and then the platform creates the bots automatically. The platform's robustness is measured by the quality of the information it retrieves (usually the number of fields) and its scalability (how quickly it can scale up to hundreds or thousands of sites). This scalability is mostly used to target the long tail of sites that common aggregators find complicated or too labor-intensive to harvest content from.

8.) Semantic annotation recognizing: Web pages may contain metadata or semantic markups/annotations which can be used to locate specific data snippets. If the annotations are embedded in the pages, as Microformats do, this technique can be viewed as a special case of DOM parsing. In the other case, the annotations, organized into a semantic layer, are stored and managed separately from the Web pages, so Web scrapers can retrieve data schemas and instructions from this layer before scraping the pages.

The above are the main techniques used for Web scraping.
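
To make a few of these techniques concrete, here are some small sketches in Python. They are illustrative only: the HTML snippets, URLs and selectors are invented for the examples and do not come from any particular scraping product.

Text grepping with regular expressions (technique 2):

    import re

    html = '<a href="http://example.org/item">Widget</a> now only $19.99!'

    # Pull link targets and dollar prices out of raw HTML with plain regexes.
    links = re.findall(r'href="([^"]+)"', html)
    prices = re.findall(r'\$\d+(?:\.\d{2})?', html)

    print(links)   # ['http://example.org/item']
    print(prices)  # ['$19.99']

HTTP programming (technique 3), sending a raw HTTP GET over a socket, with example.org as a stand-in target:

    import socket

    host = "example.org"
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"

    # Open a TCP connection to port 80 and speak HTTP by hand.
    with socket.create_connection((host, 80)) as sock:
        sock.sendall(request.encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)

    response = b"".join(chunks).decode("utf-8", errors="replace")
    print(response.splitlines()[0])  # status line, e.g. "HTTP/1.1 200 OK"

DOM-style parsing (technique 4). This sketch uses the third-party BeautifulSoup library on static HTML; retrieving content generated by client-side scripts would additionally require embedding a real browser, as the list item above describes:

    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    html = """
    <html><body>
      <div id="products">
        <p class="price">$10.50</p>
        <p class="price">$8.99</p>
      </div>
    </body></html>
    """

    # Parse the page into a tree and query parts of it, DOM-style.
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.select("#products .price"):
        print(tag.get_text())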


http://www.mozenda.com/web-scraping

SenseBot

SenseBot represents a new type of search engine that delivers a summary in response to your search query instead of a collection of links to web pages. SenseBot parses the top results from the web and prepares a text summary of them. The summary serves as a digest on the topic of your query, blending together the most significant and relevant aspects of the search results. The summary itself becomes the main result of your search. SenseBot's approach is unique and protected by a pending patent.
SenseBot is a semantic search engine, meaning that it attempts to understand what the result pages are about. It uses text mining to parse web pages and identify their key semantic concepts. It then performs multi-document summarization of the content to produce a coherent summary.
Search engines are as much the first source of information as they are the starting point for research. Summarization of the content of query results is an innovative technique for obtaining an intelligent response from the system. This is the concept behind SenseBot, and I am glad that Dmitri Soubbotin took the time to answer a few questions on its development.
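
SenseBot's actual summarization method is proprietary, but the general idea of extractive summarization can be sketched in a few lines of Python. This naive version simply scores sentences by word frequency; it is an assumption-laden illustration, not SenseBot's algorithm.

    import re
    from collections import Counter

    def summarize(text, n=2):
        # Naive extractive summary: score each sentence by word frequency.
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(re.findall(r"[a-z']+", text.lower()))

        def score(sentence):
            return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

        top = sorted(sentences, key=score, reverse=True)[:n]
        # Emit the chosen sentences in their original order.
        return " ".join(s for s in sentences if s in top)

    doc = ("Web bots crawl pages. Bots can index pages for search engines. "
           "Some bots also summarize pages for readers.")
    print(summarize(doc, n=1))

A real multi-document summarizer would also deduplicate across pages and weigh query relevance, but the scoring-and-selection loop above is the core of the extractive approach.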

References
Arun Radhakrishnan, Summarization, Interview with Dmitri Soubbotin of SenseBot, Retrieved December 11, 2007.
URL

Clever Bot



Cleverbot is Carpenter’s latest chatting AI. It uses a growing database of 20+ million online conversations to talk with anyone who visits its website, and it can also be described as a computerized program that you can chat with. Cleverbot learns from humans and has conversations with you.
At this stage in its development, talking to Cleverbot is like having a text conversation with a monkey tied to a typewriter as it is being flung down a flight of stairs. Eventually, though, automated conversationalists will become a staple of entertainment and business.
Already, robotic voices answer the phone at many large corporations. One day Cleverbots may provide companionship that we won’t be able to discern from the real thing.
The Cleverbot test took place at the Techniche festival in Guwahati, India. Thirty volunteers conducted a typed 4-minute conversation with an unknown entity. Half of the volunteers spoke to humans while the rest chatted with Cleverbot. All the conversations were displayed on large screens for an audience to see.
The picture above shows a test at http://www.cleverbot.com/
cleverbot.com is a website where you chat with a machine, and it automatically replies to the queries asked by the user.


Reference
Jacob Aron, Software Tricks People into Thinking It Is Human, Retrieved September 2011.
URL:

Michael Davidson, How to Outsmart a Clever Bot, Retrieved July 12, 2011.

Aaron Saenz, Cleverbot Chat Engine Is Learning From The Internet To Talk Like A Human, Retrieved January 13, 2010.
URL:


Wednesday, 19 October 2011

Web Bot-Webscraper

Webscraper
There is some really great webscraper software on the market now. Webscraper software can be an invaluable tool in building a new business and in any endeavor requiring extensive research. The new generation of programs incorporates an easy-to-use GUI with well-defined options and features. It has been estimated that what normally takes 20 man-days can now be performed by these programs in only 3 hours. With a reduction in manpower, costs are trimmed and project timelines are moved up. Webscraper programs also eliminate the human error that is so common in such a repetitive task. In the long run, the hundreds of thousands of dollars saved are worth the initial investment.

Video of a screen scraper software, Competitor Data Harvest:
http://www.youtube.com/watch?v=uU91vOsS6xc&feature=player_embedded

Extract Data from the website
Data extraction from a site can be done with extraction software, and it is very easy. What does data extraction mean? It is the process of pulling information from a specific internet source, assembling it and then parsing that information so that it can be presented effectively in a useful form. With a good data extraction tool the process takes little time and effort, and anyone can do it; it is also simple to extract and store data in order to publish it elsewhere. Software to extract data from websites is being used by numerous people, with amazing results. Information of all types can be readily harvested.

Free Data Mining Software
There is now a lot of free data mining software available for download on the internet. This software automates the laborious task of web research. Instead of using the search engines, clicking through page after page until your eyes tire and strain, and then using the archaic copy-and-paste into a second application, it can all be set to run while we relax and watch television. Data mining used to be an expensive proposition that only the biggest of businesses could afford, but now there is free data mining software that individuals with basic needs can use to their satisfaction. Many people swear by the free programs and have found no need to go further.

Create data feeds
Creating data feeds, especially RSS, is the process of publishing web content in an online form for easy distribution of information. RSS has enabled the distribution of content from the internet globally. Any type of information located on the web can be turned into a data feed, whether it is for a blog or a news release. The best thing about using a web scraper for these purposes is that it ensures your information is easily captured, formatted and syndicated. Cartoonists and writers in the newspaper business create data feeds for their work to be disseminated to readers. Web scraping has made this kind of large-scale information sharing possible.
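
As a rough sketch of how scraped content can be turned into a feed, the following Python example builds a minimal RSS 2.0 file from a list of (title, link) pairs; the headlines and URLs are invented for the example.

    import xml.etree.ElementTree as ET

    # Headlines a scraper might have collected (invented for this sketch).
    items = [("First headline", "http://example.org/1"),
             ("Second headline", "http://example.org/2")]

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example Scraped Feed"
    ET.SubElement(channel, "link").text = "http://example.org/"
    ET.SubElement(channel, "description").text = "Headlines gathered by a web scraper"

    for title, link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link

    # Write the finished feed; readers can now subscribe to feed.xml.
    ET.ElementTree(rss).write("feed.xml", encoding="utf-8", xml_declaration=True)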

Deep Web Searching
The deep web is the part of the Internet that you cannot see. It cannot be covered by traditional search engines, whose bots fail to find all of its data and information. Deep web searching needs to be done by programs that know specifically how to find this information and extract it. Information found in the deep web includes the PDF, Word, Excel and PowerPoint documents that sit behind a web page. Having access to this information can be very valuable to business owners and law enforcement, since much of it is information that the rest of the public cannot access.

Credits to:
http://www.mozenda.com/web-bot

Tuesday, 18 October 2011

Crossword puzzle

Attention! Our crossword puzzle is released. Those who are interested in joining our contest should fill in all the answers and then send them to the author's mail.
Good Luck!!!


Across

  1.  A search engine spider
  2.  Another name for a web crawler
  3.  Retrieves information from websites
  4.  A software application that does repetitive and automated tasks on the Internet
  5.  Sends spam emails and typically works by automated means
  6.  Uses the Inktomi database

Down

  1. A special type of bot that is used to check prices
  2. Other name for a web crawler
  3. Crawling, indexing and serving are the key processes in delivering search results.
Malware Bots



Like any program, a bot can be used for good or for evil. "Evil" web crawlers scan web pages for email addresses, instant messaging handles and other identifying information that can be used for spamming purposes. Crawlers can also engage in click fraud, the practice of visiting an advertising link for the purpose of driving up a pay-per-click advertiser's cost. Spambots send hundreds of unsolicited emails and instant messages per second. Malware bots can even be programmed to play games and collect prizes from some websites.


A malware bot is a type of bot which allows an attacker to gain complete control over the affected computer. Computers that are infected with a 'bot' are generally referred to as 'zombies'. There are literally tens of thousands of computers on the Internet which are infected with some type of 'bot', and their owners don't even realize it.
Attackers are able to access and activate them to help execute DoS attacks against web sites, host phishing attack web sites or send out thousands of spam email messages. Should anyone trace the attack back to its source, they will find an unwitting victim rather than the true attacker. DoS stands for denial of service. The malware bots that are used for injecting viruses or malware into websites, or for scanning them looking for security vulnerabilities to exploit, are the most dangerous.
It’s no surprise that blogs and ecommerce sites that become targets of these practices end up being hacked and injected with porn or Viagra links. All these undesirable bots tend to look for information that is normally off limits, and in most situations they completely disregard robots.txt commands.

Detecting infection

The flood of communications in and out of your PC helps antimalware apps detect a known bot. "Sadly, the lack of antivirus alerts isn't an indicator of a clean PC," says Nazario. "Antivirus software simply can't keep up with the number of threats. It's frustrating [that] we don't have significantly better solutions for the average home user, more widely deployed." Even if your PC antivirus check comes out clean, be vigilant. Microsoft provides a free Malicious Software Removal Tool. One version of the tool, available from both Microsoft Update and Windows Update, is updated monthly; it runs in the background on the second Tuesday of each month and reports to Microsoft whenever it finds and removes an infection.

Reference
Amber Viescas, What Are Web Bots? [online], Retrieved July 25, 2011.
URL:
http://www.ehow.com/info_8783848_bots.html

Tony Bradley, What Is a Bot? [online]
URL:
http://netsecurity.about.com/od/frequentlyaskedquestions/qt/pr_bot.htm

Augusto Ellacuriaga, Practical Solutions for Blocking Spam Bots and Scrapers, Retrieved June 25, 2008.
URL:
http://www.spanishseo.org/block-spam-bots-scrapers

Review of HotBot

HotBot, owned by Terra/Lycos, is one of the older Web search engines. Originally it used just the Inktomi database, and then added Direct Hit and the Open Directory. Then in Dec. 2002, it relaunched as a multiple search engine with Inktomi, Fast, Google, and Teoma. In July 2003, they stayed with the same four databases but renamed them HotBot, Lycos, Google, and Ask Jeeves. Lycos was dropped in March 2004. This review covers HotBot using the Inktomi database, which they now call "HotBot." See the Google and Teoma (Ask Jeeves) reviews for more details on how their databases and interfaces work, bearing in mind that not all features are available at HotBot. The basic search screen shows no options, but choose Advanced Search for the full range of search features. To see how HotBot used to work, see the old Search Engine Showdown review. Use the table of contents on the left to navigate this review.

Databases: HotBot offers the choice of three search engine databases:
  • HotBot (which is actually a Yahoo!/Inktomi database, and the version reviewed here)
  • Google
  • Ask Jeeves (the Teoma database)
Strengths
  • Advanced searching capabilities
  • Quick check of three major databases
  • Advanced search help
Weaknesses
  • Does not include all the advanced features of each of the underlying databases
  • No cached copies of pages
  • Only displays a few hits from each domain with no access to the rest in Inktomi
Reference:
Greg R. Notess, Review of HotBot [online], Retrieved April 15, 2004.

Friday, 14 October 2011

IBM WebFountain

WebFountain is an Internet analytics engine implemented by IBM for the study of unstructured data on the World Wide Web. IBM describes WebFountain as a set of research technologies that collect, store and analyze massive amounts of unstructured and semi-structured text. It is built on an open, extensible platform that enables the discovery of trends, patterns and relationships from data.

The project represents one of the first comprehensive attempts to catalog and interpret the unstructured data of the Web in a continuous fashion. To this end its supporting researchers at IBM have investigated new systems for the precise retrieval of subsets of the information on the Web, real-time trend analysis, and meta-level analysis of the available information of the Web.

Factiva, an information retrieval company owned by Dow Jones and Reuters, licensed WebFountain in September 2003, and has been building software which utilizes the WebFountain engine to gauge corporate reputation. Factiva reportedly offers yearly subscriptions to the service for $200,000. Factiva has since decided to explore other technologies, and has severed its relationship with WebFountain.

WebFountain is developed at IBM's Almaden research campus in the Bay Area of California.

IBM has developed software, called UIMA (Unstructured Information Management Architecture), that can be used for the analysis of unstructured information. It can perhaps help perform trend analysis across documents, determine the theme and gist of documents, and allow fuzzy searches on unstructured documents.

http://www.redbooks.ibm.com/abstracts/redp3937.html
http://news.cnet.com/IBM-sets-out-to-make-sense-of-the-Web/2100-1032_3-5153627.html
Good News!!!
Hi everyone, we are Jason, Liew Wee Sheng and Chew Chu Chiang. We are organizing a crossword puzzle contest whose topic is related to our blog. There are some interesting prizes to give away to participants. For first prize we will give out a 4 GB pendrive, for second prize an external USB hub, and for third prize an optical mouse.
Are you interested in our prizes? To get them, just follow a few steps to join our contest. First, go to our blog www.webbotcly.blogspot.com, then download the form that is provided and fill it in. Lastly, send the form to our author's mail.
So what are you waiting for? Come and join our contest immediately. If you have any problems, you can send an email to us.
Join Our Contest Now!!!

The video above promotes our crossword puzzle contest. If you are interested, just fill in the form and send it to our author's mail. Thank you!

Facts on the Spambot

Basic
Spambots are designed to send spam emails and typically work by automated means, hence the word "bot." The messages are pre-typed and sent en masse to various email addresses. Responding to a spambot will result in a return message that mimics a person but is in actuality just a program. The emails frequently contain links to sites where people can purchase the services or products the spambots claim to offer.
Forum Bots
Spambots will also post on forums and message boards. They submit bogus content in an attempt at target marketing or display advertising, promoting a particular product or service while pretending to be an actual person. It may be difficult at times to discern whether a post is genuine or spam.
Email Harvesting
Spambots collect email addresses from postings on the Internet. Forums, blogs and websites may contain email addresses of potential consumers, which the spambot scans and adds to its database. It uses programs called spiders that scan the posted email addresses from web pages. The bots can use these email addresses and the web pages they were found on to send unsolicited advertisements to customers or even fake web pages designed to look official. These fake pages are meant to provoke customers into entering login information, which can lead to identity theft. This technique is called "phishing."

Reference
Brenton Shields, Facts on the Spambot, Retrieved June 27, 2011.

Fourth Poll Analysis


This poll asked whether Web Bot keyword prediction is helpful or not. The picture above shows the result of the vote. There were 13 voters for this question: 10 people voted that keyword prediction is helpful, and 3 people voted "Sometimes". The pie chart shows that 77% of voters voted helpful, 23% voted sometimes, and 0% voted no.

Conclusion
The poll result shows that keyword prediction is considered helpful, as agreed by 10 of the 13 voters. None of the voters chose "no, it's not", which shows that keyword prediction really is helpful. But 3 voters chose "sometimes", so keyword prediction is still not perfect.

Thursday, 13 October 2011

Shopbot



A shopbot is a special type of web bot. It can also be called a shopping robot, and it can be used to check prices with many retailers on the web. Shopping bots are comparison shopping web sites that help consumers find the best prices offered by online retailers. Shopping bots operate in a similar way to search engines. This bot-like software can be sent out on a mission, usually to find prices from different online stores and report back to the user, so users sometimes call them intelligent agents.
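
The core report-back loop is easy to sketch. In this hedged Python example the price data is hard-coded; a real shopbot would scrape each retailer's site (and ideally compare total cost, including shipping and taxes, as noted in the features list below).

    # Invented price data; a real shopbot would fetch and parse retailer pages.
    def get_prices(product):
        return {"StoreA": 21.00, "StoreB": 18.49, "StoreC": 19.99}

    def cheapest(product):
        prices = get_prices(product)
        store = min(prices, key=prices.get)  # retailer with the lowest price
        return store, prices[store]

    store, price = cheapest("optical mouse")
    print(f"Best offer: {store} at ${price:.2f}")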

Type of Shopping Bot
There are two general types of shopping bots. The first is the broad-based type that searches a wide range of product categories, such as MySimon.com and PriceScan.com. These sites operate using a Yellow Pages type of model, in that they list every retailer they can find.

Evaluating Shopping Bots

Even the best bots can't provide guarantees for finding the lowest price or search all retailers. Consumer advocates recommend evaluating prices found by at least three shopping bots before making a purchase.

Some bots are biased in what they report to consumers and give preferential treatment to their marketing partners. Results given to users may only be from retailers that pay a fee to be listed, so users may never learn about bargains from reputable low-cost sellers. Some bots sort prices so that the results from their marketing partners always appear at the top, rather than sorting by lowest price.

Helpful Shopping Bot Features:

  • Easy navigation
  • Well-organized results
  • Sorting by total cost (including shipping, handling, taxes, and restocking fees)
  • Prices from a large number of retailers
References
Gregg Keizer, Web Shopping: Bots and Beyond, PCWorld, Retrieved November 28, 2001.
URL:

David Dinning, What Are Spiders & Web Bots?, Retrieved July 16, 2011.
URL:



Friday, 7 October 2011

Web Bot: 1.2 billion dead in BP oil spill, Nov. 2010 nuclear war. Accurate? Will ETs intervene?

The Web Bot technology is now predicting a 1.289+ billion mega-death resulting from an “ill-wind” and the BP Gulf oil disaster. Researcher Clif High has published a prediction expecting a ‘tipping point’ around November 8, 2010 into global nuclear war, triggered by a mistaken Israeli-influenced attack on Iran that could come anytime after July 11, 2010.
The BP Gulf of Mexico oil catastrophe was arguably foreseen in prophetic scenarios as set out in various sacred prophecy texts such as the Book of Revelation and the Hopi Prophecy’s Seventh Sign, as well as the visions and prophecies of 20th and 21st century psychics, extraterrestrial contactees, and shamans such as psychic Edgar Cayce (1877-1945), Argentina’s ET contactee Benjamin Solari Parravicini (1898-1974) and Zulu shaman Credo Mutwa (1921 - present).
The Web Bot ALTA report and Zulu shaman Credo Mutwa, both of whom arguably accurately predicted the BP Gulf of Mexico oil catastrophe before it occurred, are now predicting that the BP Gulf oil catastrophe may be one of the largest single human depopulation events in history.
A Web-Bot ALTA report dated June 21, 2010 predicts that “1.289+ billion people” may die from the catastrophic effects of the April 20, 2010 BP oil spill and related environmental impacts in a period starting mid-July 2010. The Web-Bot ALTA REPORT states that “The [oil volcano] subset continues to gain support in support of the [ill winds] area, and is still gaining support for those subsets indicating that 1.289+ billion people will perish as a result of the [ill winds] and the [oil volcano].”
According to Web Bot, this high death figure may come as a result of interactivity between the impact of the BP oil catastrophe and an expected global nuclear war starting around the period commencing November 8, 2010.
Zulu shaman and noted author Credo Mutwa on January 7, 2010 predicted an oil-related catastrophe, approximately two and one-half months before the April 20, 2010 BP Gulf oil spill occurred. On January 7, 2010, an individual who reportedly had just attended a meeting with Zulu shaman and author Credo Mutwa in Africa posted the following message on an internet chat board, “Credo Mutwa apparently just now said half the worlds population won’t see 2011 at a gathering where I'm attending. Some delegates have walked out because he didn't want to give an acceptable explanation, he just said ‘it's no asteroid, comet, plague, ... just OIL’
The current world population is estimated at 6.7 billion. If half the world’s population were to die of causes that can be originally tied to the BP Gulf oil catastrophe (as well as nuclear war), that would mean that approximately 3.3 billion persons would die if Credo Mutwa’s prediction came true.

It should be noted that an alternative multi-dimensional ‘reading’ regarding the accuracy of Zulu shaman Credo Mutwa’s psychic prediction indicated that his information may have been derived from “lower astral” dimensions influenced by the reptilian factors that Mr. Mutwa tends to focus on.

A review of the literature reveals a number of hypothetical worst-case scenarios for the ecological, biosphere, economic and social impact of the BP oil spill, as well as intentional international destabilization resulting in global nuclear war. One of these worst-case scenarios is that the BP oil spill (and a possible 2010 global nuclear war) are part of an intentional depopulation plan, undertaken and designed by a Rothschild-Rockefeller led (or possibly grey-reptilian extraterrestrial influenced) Malthusian elite to eliminate a substantial portion of present humanity, from one billion persons to half or more of our current human population. Examiner.com has reported that there exists empirical research, based on direct reports of abducted persons, connecting a grey and hybrid extraterrestrial intervention strategy to a global environmental catastrophe.

An analysis of the BP oil spill worst-case scenarios and of the Web Bot and ALTA report technology itself suggests, however, that the Web Bot predictions may be based on memes generated by the Web Bot and other worst-case predictions themselves. In this case, a hacking of the rense.com website and illegal distribution of the Web Bot ALTA report containing the prediction of 1.2 billion dead from the BP oil spill may itself have led to a self-fulfilling meme magnification effect in the Web Bot prediction.
Examiner.com has reported on this short-coming of the Web Bot technology in our reporting on the “2012 catastrophe meme”: “(A) 2012 meme - One possibility is that the Web Bot technology may be detecting the presence on the Internet of an escalating meme regarding a ‘repetition of a Carrington-type [solar flare] event during the 2012-13 solar maximum,’ rather than an actual future event. This is the more probable 2012 reality.” Mr. High has also exhibited a tendency to refer readers to cataclysmic 2012-13 pole shift scenarios that are scientifically implausible.
The same methodological shortcoming may apply to the Web Bot prediction of global nuclear war by November 2010. Such maverick leaders as former Cuban President Fidel Castro Ruz, in a June 27, 2010 letter, predicted a Web Bot-like scenario whereby a global nuclear war would erupt out of a U.S.-Israeli attack on Iran and would itself disrupt the world’s food supply and have an incalculable effect on the environment. The Web Bot may be processing future memes of catastrophe and reifying them into actual war. Regarding the Web Bot’s reports of possible global nuclear war by Nov. 2010, Examiner.com has reported on the concerns of extraterrestrial civilizations about the detrimental dimensional impacts of possible global nuclear war. Examiner.com has reported that “There is converging objective predictive evidence, expert opinion, exopolitical policy analysis, and extraterrestrial contactee communications supporting a hypothesis that large scale “wild card” event(s), involving mass extraterrestrial (UFO) sightings or landings over major urban or other visible centers on the planet for peaceful purposes may occur during the period leading up to 2011-12 or beyond.”
Thus, should an unwise U.S. and Israeli attack on Iran escalate into global nuclear war, it is plausible, by objective evidence, that extraterrestrial civilizations may intervene to prevent such a war (or in aid of a false flag ET invasion).
This Examiner.com article explores the evidence that, despite the Web Bot and ALTA reports predictions of mega death from the BP oil disaster and possible 2010 (and beyond) nuclear war, the objective reality is that (1) mega death will probably not occur; (2) global nuclear war will not occur; and (3) Extraterrestrial civilizations may intervene to prevent a global nuclear holocaust destroying the ecology of Earth and humankind, or as a false flag ET operation to impose a global oppressive dictatorship on humanity.

Alfred Lambremont Webre, Seattle Exopolitics Examiner, June 30, 2010.
URL
http://www.examiner.com/exopolitics-in-seattle/web-bot-1-2-billion-dead-bp-oil-spill-nov-2010-nuclear-war-accurate-will-ets-intervene

Webbot Command Line Syntax

This post shares the webbot command line syntax from the W3C.

The generic syntax is: webbot [ options ] [ url [ keywords ] ]
Robots.txt and HTML META tags

There are situations where you may not want the robot to behave as a robot but more as a link checker in which case you may consider using these options:

-norobotstxt
If for some reason you don't want the robot to check for a robots.txt file, then add this command line option
-nometatags
If for some reason you don't want the robot to check for HTML robots-related META tags, then add this command line option
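
For comparison, checking robots.txt is the polite default. Here is a small sketch using Python's standard urllib.robotparser to ask whether a given user agent may fetch a page; the w3.org URLs are used only as an example.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt, the way a polite bot would.
    rp = RobotFileParser("http://www.w3.org/robots.txt")
    rp.read()
    print(rp.can_fetch("webbot", "http://www.w3.org/Robot/"))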

Distribution and Statistics Features
Note that if you are using SQL-based logging, then the set of statistics that can be drawn directly from the database is very large.

-charset [ file ]
Specifies a log file of which charsets (content type parameters) were encountered in the run and their distribution
-format [ file ]
Specifies a log file of which media types (content types) were encountered in the run and their distribution
-hit [ file ]
Specifies a log file of URIs sorted by how many times they were referenced in the run
-lm [ file ]
Specifies a log file of URIs sorted by last-modified date. This gives a good overview of the dynamics of the web site that you are checking.
-rellog [ file ]
Specifies a log file of any link relationship found in the HTML LINK tag (either the REL or the REV attribute) that has the relation specified in the -relation parameter (all relations are modelled by libwww as "forward"). For example, "-rellog stylesheets-logfile.txt -relation stylesheet" will produce a log file of all link relationships of type "stylesheet". The format of the log file is

"from-URI --> to-URI"

meaning that the from-URI has the forward relationship with the to-URI.
-title [ file ]
Specifies a log file of URIs sorted by any title found either as an HTTP header or in the HTML.
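
Putting a few of the options above together, a run might look like the following (a hedged example; see the W3C page linked below for the authoritative syntax). This invocation skips the robots.txt check and writes hit-count and media-type logs while traversing from the start URL:

    webbot -norobotstxt -hit hits.log -format types.log http://www.w3.org/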


References
Henrik Frystyk Nielsen, Webbot Command Line Syntax, Retrieved 04/05/1999.
URL :
http://www.w3.org/Robot/User/CommandLine.html

Thursday, 6 October 2011

ATTENTION! Contest Coming Soon


Hello everyone, we would like to inform you that we will be having a Crossword Puzzle contest coming soon. All of the puzzle questions will be related to our title, which is "Web Bot".
Rules and Regulations
Participants should follow the rules below:
  • Those below the age of 18 are not allowed to register for this contest

  • This contest has no area limit, but participants must be in Malaysia

  • Fill in the form accordingly to complete the registration for the contest, and submit the form to J11008245@hotmai.com, technoint@hotmail.com or

  • Name:
    IC Number:
    Gender:
    Email Address:
    State:
    City:
    Contact Number:
  • Download the form from this link: https://docs.google.com/document/d/100-8uw02XMxjKHQsBALqvaFKeO43kNEV-GPgswSLl7I/edit?hl=en_GB
  • Each person is only allowed to submit one entry; any entry found to be a duplicate or to involve cheating will be disqualified

  • Complete the Crossword Puzzle and submit it in a .txt file via email to J11008245@hotmai.com, technoint@hotmail.com or


  • Prizes will be given to those who submit their answers first

  • Only the entries with the most correct answers will get a prize


  • PRIZES
  • First prize: 4GB Pendrive x1

  • Second prize: External USB hub x1

  • Third prize: Optical mouse x1

Internet Bots

An Internet bot is a software application that does repetitive and automated tasks on the Internet that would otherwise take humans a long time to do. The most common Internet bots are the spider bots, which are used for web server analysis and file data gathering. Bots are also used to provide the higher response rates required by some online services like online auctions and online gaming.

Web interface programs like instant messaging and Internet Relay Chat applications can also be used by Internet bots to provide automated responses to customers. These Internet bots can be used to give weather updates, currency exchange rates, sports results, telephone numbers, etc. Examples are Jabberwacky on Yahoo Messenger or SmarterChild on AOL Instant Messenger. Moreover, bots may be used as censors in chat rooms and forums.

Today, bots are used in even more applications and are even available for home and business use. These new bots are based on a code called LAB code, which uses the Artificial Intelligence Mark-up Language. Some sites like Lots-A-Bots and RunABot offer these types of services, where one can send automated IMs, emails, replies, etc.
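
The automated-response idea is simple to sketch. In this hedged Python example the reply logic is keyword matching over canned data; a real IM bot would plug the same respond() function into a messaging service's API.

    # Invented canned answers for the sketch; a real bot would look these up live.
    RESPONSES = {
        "weather": "Today looks sunny, around 28 C.",
        "time": "It is currently 10:42 AM.",
    }

    def respond(message):
        # Return the first canned answer whose keyword appears in the message.
        for keyword, answer in RESPONSES.items():
            if keyword in message.lower():
                return answer
        return "Sorry, I do not understand that yet."

    print(respond("What's the weather like?"))  # -> Today looks sunny, around 28 C.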


URL: http://rield.com/faq/web-robots

http://www.tech-faq.com/internet-bots.html

Third Poll Result Analysis


The title of last week's poll was "Do you think that having Web Bot in your life is good?". We have created a pie chart of the vote results.


The vote results showed that only 12 people voted in total. 7 people voted "Yes" and 3 people voted "No". The other 2 people voted "I'm not sure" whether having Web Bot in their life is good or not.




In conclusion, according to the chart above, most people voted that having Web Bot in their lives is good. That means Web Bot is very helpful for a lot of people. Besides that, we will keep improving our blog and sharing more information with all of you.


