Monday, 31 October 2011
Chat Bot
Friday, 28 October 2011
Web Bot Contest Answer is Released!!
Thursday, 27 October 2011
Fifth Poll result Analysis
Last week's poll asked “Do you guys know what Malware Bots are?” We created a pie chart of the results.
The figure above shows that there were only 2 votes in total: 1 person voted “Yes” and the other voted “I don’t know what that is.”
In conclusion, according to the chart above, only 1 person voted that he knows about Malware Bots, which means at least some readers know what Malware Bots are. Lastly, we will continue to improve our blog and share more interesting information with our readers.
Web Scraping & techniques
Techniques for Web scraping
Web scraping is the process of automatically collecting Web information. Web scraping is a field with active developments sharing a common goal with the semantic web vision, an ambitious initiative that still requires breakthroughs in text processing, semantic understanding, artificial intelligence and human-computer interactions. Web scraping, instead, favors practical solutions based on existing technologies that are often entirely ad hoc. Therefore, there are different levels of automation that existing Web-scraping technologies can provide:
1.) Human copy-and-paste: Sometimes even the best Web-scraping technology cannot replace a human’s manual examination and copy-and-paste, and sometimes this may be the only workable solution when the websites for scraping explicitly set up barriers to prevent machine automation.
2.) Text grepping and regular expression matching: A simple yet powerful approach to extract information from Web pages can be based on the UNIX grep command or regular expression matching facilities of programming languages (for instance Perl or Python).
3.) HTTP programming: Static and dynamic Web pages can be retrieved by posting HTTP requests to the remote Web server using socket programming.
4.) DOM parsing: By embedding a full-fledged Web browser, such as Internet Explorer or the Mozilla Web browser control, programs can retrieve the dynamic content generated by client-side scripts. These browser controls also parse Web pages into a DOM tree, from which programs can retrieve parts of the pages.
5.) HTML parsers: Some semi-structured data query languages, such as XQuery and HTQL, can be used to parse HTML pages and to retrieve and transform Web content.
6.) Web-scraping software: Many Web-scraping software tools are available for building customized Web-scraping solutions. Such software may attempt to automatically recognize the data structure of a page, provide a Web recording interface that removes the need to write scraping code by hand, offer scripting functions for extracting and transforming Web content, or provide database interfaces for storing the scraped data in local databases.
7.) Vertical aggregation platforms: Several companies have developed vertical-specific harvesting platforms. These platforms create and monitor a multitude of “bots” for specific verticals with no man-in-the-loop and no work tied to a specific target site. The preparation involves establishing a knowledge base for the entire vertical, after which the platform creates the bots automatically. A platform's robustness is measured by the quality of the information it retrieves (usually the number of fields) and by its scalability (how quickly it can scale up to hundreds or thousands of sites). This scalability is mostly used to target the long tail of sites that common aggregators find complicated or too labor-intensive to harvest content from.
8.) Semantic annotation recognition: Web pages may contain metadata or semantic markups/annotations which can be used to locate specific data snippets. If the annotations are embedded in the pages, as with Microformats, this technique can be viewed as a special case of DOM parsing. In the other case, the annotations, organized into a semantic layer, are stored and managed separately from the Web pages, so Web scrapers can retrieve the data schema and instructions from this layer before scraping the pages.
The above are the main techniques of Web scraping.
http://www.mozenda.com/web-scraping
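Technique 2 above (text grepping with regular expressions) can be sketched in a few lines of Python. The HTML snippet and the patterns here are simplified examples made up for illustration; real-world patterns need to be more robust:

```python
import re

# A stand-in for a downloaded page.
html = """
<p>Contact <a href="mailto:sales@example.com">sales@example.com</a>
or visit <a href="http://example.com/shop">our shop</a>.</p>
"""

# Pull out email addresses and HTTP links with simple patterns.
emails = sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", html)))
links = re.findall(r'href="(http[^"]+)"', html)

print(emails)  # ['sales@example.com']
print(links)   # ['http://example.com/shop']
```

The same idea works from a UNIX shell with `grep -o` over a saved page, which is why this is the lightest-weight scraping technique on the list.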
SenseBot
Clever Bot
Wednesday, 19 October 2011
Web Bot-Webscraper
There is some really great webscraper software now on the market. Webscraper software can be an invaluable tool in building a new business and in any endeavor requiring extensive research. The new generation of programs incorporates an easy-to-use GUI with well-defined options and features. It has been estimated that what normally takes 20 man-days can now be performed by these programs in only 3 hours. With a reduction in manpower, costs are trimmed and project timelines are moved up. Webscraper programs also eliminate the human error that is so common in such repetitive tasks. In the long run, the hundreds of thousands of dollars saved are worth the initial investment.
Video of a screen scraper software, Competitor Data Harvest:
http://www.youtube.com/watch?v=uU91vOsS6xc&feature=player_embedded
Extract Data from the website
Data extraction from a site can be done easily with extraction software. What does data extraction mean? It is the process of pulling information from a specific Internet source, assembling it, and then parsing it so that it can be presented in a useful form. With a good data extraction tool the process is quick and simple; anyone can extract data, store it, and publish it elsewhere. Software that extracts data from websites is being used by numerous people, with impressive results, and information of all types can be readily harvested.
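As a sketch of that pull-assemble-parse cycle, Python's built-in html.parser module can pick structured pieces out of a page. The page content below is a made-up stand-in for one fetched from a real site (e.g. via urllib.request):

```python
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collect the text of every <h2> heading on a page."""

    def __init__(self):
        super().__init__()
        self.in_h2 = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h2":
            self.in_h2 = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self.in_h2 = False

    def handle_data(self, data):
        if self.in_h2:
            self.headings.append(data.strip())

# Stand-in for a downloaded page.
page = "<h1>Shop</h1><h2>Laptops</h2><p>...</p><h2>Cameras</h2>"

parser = HeadingExtractor()
parser.feed(page)
print(parser.headings)  # ['Laptops', 'Cameras'] -- ready to store or publish
```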
Free Data Mining Software
There is now a lot of free data mining software available for download on the internet. This software automates the laborious task of web research. Instead of using search engines, clicking through page after page until our eyes tire, and then using the archaic copy-and-paste into a second application, the whole job can be set to run while we relax and watch television. Data mining used to be an expensive proposition that only the biggest businesses could afford, but there is now free data mining software that individuals with basic needs can use to their satisfaction. Many people swear by the free programs and have found no need to go further.
Create data feeds
Creating data feeds, especially RSS feeds, is the process of publishing web content in a form that makes information easy to distribute. RSS has enabled content from the internet to be distributed globally. Any type of information located on the web can be turned into a data feed, whether it is for a blog or a news release. The best thing about using a web scraper for this purpose is that your information is easily captured, formatted and syndicated. Cartoonists and writers in the newspaper business create data feeds so their work can be disseminated to readers. This process has greatly enhanced the sharing of information.
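As a rough sketch, a scraper can assemble captured items into an RSS 2.0 feed with Python's standard library. The feed title and item links below are invented placeholders:

```python
import xml.etree.ElementTree as ET

# Items a scraper might have captured: (title, link) pairs.
items = [
    ("First post", "http://example.com/1"),
    ("Second post", "http://example.com/2"),
]

# Build the minimal RSS 2.0 skeleton: <rss><channel>...</channel></rss>.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example feed"
ET.SubElement(channel, "link").text = "http://example.com/"
ET.SubElement(channel, "description").text = "Scraped content, syndicated"

# One <item> per captured entry.
for title, link in items:
    item = ET.SubElement(channel, "item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = link

feed = ET.tostring(rss, encoding="unicode")
print(feed)  # ready to serve as feed.xml
```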
Deep Web Searching
The deep web is the part of the Internet that you cannot see: traditional search engines and their bots cannot find its data and information. Deep web searching must be done by programs that know specifically how to find this information and extract it. Information found in the deep web includes the PDF, Word, Excel, and PowerPoint documents behind web pages. Having access to this information can be very valuable to business owners and law enforcement, since much of it is information that the rest of the public cannot access.
Credits to:
http://www.mozenda.com/web-bot
Tuesday, 18 October 2011
Crossword puzzle
Good Luck!!!
Across
- A search engine spider
- other name of web crawler
- retrieves information from websites
- A software application that does repetitive and automated tasks on the Internet
- Sends spam emails and typically works in an automated way
- using the Inktomi database
Down
- A special type of bot that is used to check prices
- Other name of web crawler
- Crawling, indexing and serving are its key processes in delivering search results.
Like any program, a bot can be used for good or for evil. "Evil" web crawlers scan web pages for email addresses,
instant messaging handles and other identifying information that can be used for spamming purposes. Crawlers can also engage in
click fraud, or the process of visiting an advertising link for the purpose of driving up a pay-per-click advertiser's cost. Spambots send
hundreds of unsolicited emails and instant messages per second. Malware bots can even be programmed to play games and collect prizes from some websites.
A malware bot is a type of bot that allows an attacker to gain complete control over
the affected computer. Computers that are infected with a 'bot' are generally referred to as 'zombies'.
There are literally tens of thousands of computers on the Internet which are infected with some type of 'bot' and don't even realize it.
Attackers are able to access and activate them to help execute DoS attacks against web sites, host phishing attack Web sites or send out thousands of spam email messages. Should anyone trace the attack back to its source, they will find an unwitting victim rather than the true attacker.
DoS stands for denial of service. Malware bots that inject viruses or malware into websites, or scan them for security vulnerabilities to exploit, are the most dangerous.
It’s no surprise that blogs and ecommerce sites that become target of these practices end up being hacked and injected with porn or Viagra links. All these undesirable bots tend to look for information that is normally off limits and in most situations, they completely disregard robots.txt commands.
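For contrast with the undesirable bots that disregard robots.txt, a well-behaved crawler checks it before fetching anything. Python's standard urllib.robotparser makes this a few lines; the rules and URLs below are made up, and the rules are fed in directly rather than fetched over the network:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content a polite bot would honour.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A polite bot asks before every fetch.
print(parser.can_fetch("MyBot", "http://example.com/products"))     # True
print(parser.can_fetch("MyBot", "http://example.com/admin/users"))  # False
```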
Detecting infection
The flood of communications in and out of your PC helps antimalware apps detect a known bot. "Sadly, the lack of antivirus alerts isn't an
indicator of a clean PC," says Nazario. "Antivirus software simply can't keep up with the number of threats. It's frustrating [that] we don't
have significantly better solutions for the average home user, more widely deployed."Even if your PC antivirus check comes out
clean, be vigilant. Microsoft provides a free Malicious Software Removal Tool. One version of the tool, available from both Microsoft
Update and Windows Update, is updated monthly; it runs in the background on the second Tuesday of each month and reports to Microsoft whenever it finds and removes an infection.
Reference
Amber Viescas, What Are Web Bots? [online], Retrieved 25 July 2001
URL:
http://www.ehow.com/info_8783848_bots.htmlg
Tony Bradley, What is a bot?[online]
URL:
http://netsecurity.about.com/od/frequentlyaskedquestions/qt/pr_bot.htm
Augusto Ellacuriaga, Practical Solutions for Blocking Spam, Bots and Scrapers, Retrieved 25 June 2008
URL:
Review of HotBot
- HotBot (which is actually a Yahoo!/Inktomi database, and the version reviewed here)
- Ask Jeeves (the Teoma database)
- Advanced searching capabilities
- Quick check of three major databases
- Advanced search help
- Does not include all advanced features of each of the four databases
- No cached copies of pages
- Only displays a few hits from each domain with no access to the rest in Inktomi
Friday, 14 October 2011
IBM WebFountain
The project represents one of the first comprehensive attempts to catalog and interpret the unstructured data of the Web in a continuous fashion. To this end its supporting researchers at IBM have investigated new systems for the precise retrieval of subsets of the information on the Web, real-time trend analysis, and meta-level analysis of the available information of the Web.
Factiva, an information retrieval company owned by Dow Jones and Reuters, licensed WebFountain in September 2003, and has been building software which utilizes the WebFountain engine to gauge corporate reputation. Factiva reportedly offers yearly subscriptions to the service for $200,000. Factiva has since decided to explore other technologies, and has severed its relationship with WebFountain.
WebFountain is developed at IBM's Almaden research campus in the Bay Area of California.
IBM has developed software called UIMA (Unstructured Information Management Architecture) that can be used for the analysis of unstructured information. It can help perform trend analysis across documents, determine the theme and gist of documents, and allow fuzzy searches on unstructured documents.
http://www.redbooks.ibm.com/abstracts/redp3937.html
http://news.cnet.com/IBM-sets-out-to-make-sense-of-the-Web/2100-1032_3-5153627.html
Facts on the Spambot
Fourth Poll Analysis
Thursday, 13 October 2011
Shopbot
- Easy navigation
- Well-organized results
- Sorting by total cost (including shipping, handling, taxes, and restocking fees)
- Prices from a large number of retailers
Friday, 7 October 2011
Web Bot: 1.2 billion dead in BP oil spill, Nov. 2010 nuclear war. Accurate? Will ETs intervene?
Webbot Command Line Syntax
- References
- Henrik Frystyk Nielsen, Webbot Command Line Syntax, Retrieved 04/05/1999
- URL : http://www.w3.org/Robot/User/CommandLine.html
Thursday, 6 October 2011
ATTENTION! Coming soon contest
Hello everyone, we would like to inform you that we will be having a Crossword Puzzle contest coming soon. All of the puzzle questions will be related to our title, which is "Web Bot".
Participants should follow the rules below:
Name | |
IC Number | |
Gender | |
Email address | |
State | |
City | |
Contact Number | |
PRIZE
Internet Bots
An Internet bot is a software application that does repetitive and automated tasks on the Internet that would otherwise take humans a long time to do. The most common Internet bots are spider bots, which are used for web server analysis and file data gathering. Bots are also used to provide the higher response rate required by some online services, such as online auctions and online gaming.
Web interface programs such as instant messaging and Internet Relay Chat applications can also be used by Internet bots to provide automated responses to customers. These Internet bots can give weather updates, currency exchange rates, sports results, telephone numbers, etc. Examples are Jabberwacky on Yahoo! Messenger and SmarterChild on AOL Instant Messenger. Moreover, bots may be used as censors in chat rooms and forums.
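A toy sketch of how such an automated responder can work; the keywords and canned replies are made up for illustration, and a real IM bot would hook this lookup into the messaging network's API:

```python
# Minimal keyword-based auto-responder -- the core idea behind IM bots
# that answer questions about weather, sports results, and so on.
RESPONSES = {
    "weather": "It is sunny today.",
    "time": "The current time is 14:05.",
}

def reply(message: str) -> str:
    """Return the first canned answer whose keyword appears in the message."""
    lowered = message.lower()
    for keyword, answer in RESPONSES.items():
        if keyword in lowered:
            return answer
    return "Sorry, I did not understand that."

print(reply("What's the WEATHER like?"))  # matches the "weather" keyword
print(reply("hello"))                     # falls through to the default
```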
Today, bots are used in even more applications and are even available for home and business use. These new bots are based on a code called LAB code which uses the Artificial Intelligence Mark-up Language. Some sites like Lots-A-Bots and RunABot offer these types of services where one can send automated IMs, emails, replies, etc.