
Lots of Bots: 5 Types of Robots that Are Running the Internet

January 24, 2018

By Duncan Argent, Independent Technology Author

Did you know that the internet is run not by humans but by robots? According to research by security firm Imperva, bots account for 52% of all web traffic. The survey, which took into account almost 17 billion website visits across roughly 100,000 domains, concluded that the bots have officially taken over the internet. The good news is that helper bots outnumber malicious bots – spambots and the botnets responsible for launching DDoS attacks – by six percentage points: benevolent bots make up 29% of web traffic, while harmful bots account for 23%. So what exactly are the most prominent helper bots out there, and what do they do?

1. Web Crawlers or Spiders

Also known as spider bots, these friendly and hardworking bots relentlessly roam the web for our convenience. Whenever a user turns to a search engine and gets accurate results, it is thanks to the web crawlers that the search engine employs. Google, for example, relies on its very own crawler software, known as Googlebot. What these spiders do is browse web pages and index them, so that search engines know what is on them and can return accurate results to user queries.

[Image source: Pexels]

Search engines typically have numerous spiders at large simultaneously, crawling from page to page and fetching content in parallel, which lets them gather the data that informs how websites are ranked in search results. With the information they retrieve, they build a large searchable index database, listing the words they encounter according to the engine's algorithm. For example, when Googlebot crawls a website, it both collects all significant words (omitting articles like "the") and takes note of where on a page each word was found – words in the title or in tags, for instance, are deemed more important.
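To make the idea concrete, here is a minimal crawl-and-index sketch in Python, using only the standard library. The stopword list, the title weighting and the example URL are illustrative assumptions, not a description of Googlebot's actual algorithm.

```python
# Minimal crawl-and-index sketch (standard library only).
# Stopwords, weights and URLs are illustrative assumptions.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in"}

class WordCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.words = []          # (word, weight) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        for word in data.lower().split():
            word = word.strip(".,!?\"'()")
            if word and word not in STOPWORDS:
                # Words found in the <title> get a higher weight.
                self.words.append((word, 3 if self.in_title else 1))

def build_index(urls):
    """Crawl each URL and build an inverted index: word -> {url: score}."""
    index = defaultdict(lambda: defaultdict(int))
    for url in urls:
        parser = WordCollector()
        parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
        for word, weight in parser.words:
            index[word][url] += weight
    return index

index = build_index(["https://example.com"])
print(index["example"])   # pages mentioning "example", with their scores
```

Answering a query then reduces to looking words up in the index and sorting the matching pages by score – a crude stand-in for the much richer ranking signals real engines use.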

2. Web Scraping Crawlers

These bots are essentially crawlers, but they serve a fundamentally different function. Instead of helping with indexing, these crawlers assist in data scraping, or data harvesting: fetching content through crawling, adding it to a database, and later extracting relevant bits of information for analysis. Although the process may sound straightforward, it is actually extremely complicated, and IT professionals are constantly developing new techniques and uses for data scraping. Web scraping software can be directed to look for specific content in specific places within a website, according to the purposes and goals of the developer who employs it.

And the uses for data scraping are countless: most prominently, it can be used to compare things, like product features, product reviews or price fluctuations – so it is very helpful for a lot of consumer-orientated software and apps. It can also be used in weather monitoring, in compiling lists of contact details from different websites, building “trending topics” lists and so on. In that respect, web scraping bots can be useful to a wide range of people, from web developers to academic researchers to advertising professionals. Most importantly, even less tech-savvy users benefit from the use of crawlers, even if they don’t employ them personally, as much of the information they consume online has been gathered through them.
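As a sketch of the price-comparison use case, the snippet below fetches a product page from two shops and extracts a price using the widely used requests and BeautifulSoup libraries. The shop URLs and CSS selectors are hypothetical; a real scraper needs per-site selectors and should respect each site's robots.txt and terms of service.

```python
# Price-comparison scraping sketch. URLs and CSS selectors are
# hypothetical; every real site needs its own selector.
import requests
from bs4 import BeautifulSoup

SHOPS = {
    # shop name -> (product page URL, CSS selector for the price element)
    "shop-a": ("https://shop-a.example/widget", ".price"),
    "shop-b": ("https://shop-b.example/widget", "#product-price"),
}

def fetch_price(url, selector):
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").select_one(selector)
    # Strip currency symbols and thousands separators before parsing.
    return float(tag.get_text().strip().lstrip("$").replace(",", ""))

prices = {shop: fetch_price(url, sel) for shop, (url, sel) in SHOPS.items()}
print(min(prices, key=prices.get), "has the lowest price")
```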

3. Text-Reading Algorithms

Bots produce perhaps their most impressive results when they execute tasks that the human brain can accomplish too, but where the human is no match for the bot's speed, volume and accuracy. This is precisely the case with text-reading bots: elaborate algorithms that can scan a text and analyse it according to specific keywords and their frequency. Text-reading bots are employed, for example, to filter comments on social media or online news outlets, in order to flag and exclude specific types of comments (e.g. offensive remarks or spam).
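A toy version of such a filter fits in a few lines. The keyword list and threshold below are invented for illustration; production moderation systems rely on far richer models.

```python
# Minimal keyword-frequency comment filter. Keywords and threshold
# are illustrative assumptions, not a real moderation policy.
SPAM_KEYWORDS = {"free", "winner", "click", "subscribe", "prize"}

def looks_like_spam(comment, threshold=0.2):
    """Flag a comment when flagged keywords exceed a share of its words."""
    words = comment.lower().split()
    if not words:
        return False
    hits = sum(1 for w in words if w.strip(".,!") in SPAM_KEYWORDS)
    return hits / len(words) >= threshold

print(looks_like_spam("Click here, winner, claim your free prize"))  # True
print(looks_like_spam("Great article, thanks for sharing"))          # False
```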

[Image source: Pexels]

Sometimes, text-reading algorithms are taken one step further. Google Translate is one such example: a user can point their smartphone or tablet camera at text in a foreign language and get a translation of it on their screen. This approach combines OCR, text-reading and translation algorithms to create meaningful output for the user.

Algorithms and bots can also be programmed to perform certain actions when they find a specific string of text, and this approach gets more sophisticated with time. One development sees traders use trading robots programmed to make transactions depending on variations in keywords in specific news and/or market reports – or even on the absence of specific keywords. Case in point: some trading bots are instructed to read the Federal Open Market Committee statement that the United States Federal Reserve issues after each of its meetings, which outlines the prospects for the economy, and then buy or sell almost instantly according to differences between the current statement and previous ones. These differences usually come down to words that set the tone of future monetary policy – which means that transactions may be overly sensitive to very broad signals, or even to slight differences in wording.
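The statement-comparison idea can be sketched as follows. The "hawkish" and "dovish" word lists and the trading rule are deliberately simplistic assumptions made for illustration; real trading systems are vastly more careful.

```python
# Toy statement-diff trading signal. Word lists and the rule are
# invented for illustration only, not a real trading strategy.
HAWKISH = {"raise", "tighten", "inflation", "restrictive"}
DOVISH = {"lower", "ease", "accommodative", "stimulus"}

def tone_score(text):
    words = text.lower().split()
    return sum(w in HAWKISH for w in words) - sum(w in DOVISH for w in words)

def signal(previous, current):
    """Sell if the new statement reads more hawkish, buy if more dovish."""
    shift = tone_score(current) - tone_score(previous)
    return "SELL" if shift > 0 else "BUY" if shift < 0 else "HOLD"

print(signal("the committee will ease policy",
             "the committee will raise rates to tighten policy"))  # SELL
```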

4. Chatbots

Chatbots, also known as chatterbots or talkbots, are among the "good" bots most recognisable to regular users, simply because users can (usually) tell when they are interacting with one. Chatbots are algorithms programmed to simulate a human interlocutor in conversation, with their capabilities tailored to the specific type of conversation for which they are employed. The classic benchmark for such programs is the Turing test, devised in 1950 by the English computer scientist Alan Turing to assess machine intelligence – a bar that most chatbots, in truth, still fall short of.

Chatbot technology nowadays is widely used in virtual assistant software, which is on the rise and expected to evolve even further in the next few years. Think Google Assistant and Apple's Siri. Some specialised chatbots take it to the next level, such as DoNotPay, which is described as a "robot lawyer". DoNotPay has helped people overturn parking fines in over 160,000 cases and has since moved on to refugee law, helping people with their immigration applications or asylum claims.
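At its simplest, a chatbot is a set of pattern-to-response rules, as in the sketch below. The patterns and replies are made up; assistants like Siri or tools like DoNotPay layer speech recognition, intent models and dialogue state on top of ideas like this.

```python
# Minimal rule-based chatbot sketch. Patterns and replies are
# hypothetical examples.
import re

RULES = [
    (re.compile(r"\b(hello|hi|hey)\b", re.I), "Hello! How can I help?"),
    (re.compile(r"\bparking (fine|ticket)\b", re.I),
     "Tell me the date and location of the ticket and we can appeal it."),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye!"),
]

def reply(message):
    # Return the response of the first rule whose pattern matches.
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Sorry, I didn't understand that. Could you rephrase?"

print(reply("Hi there"))
print(reply("I got a parking ticket yesterday"))
```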

5. Video Game Bots

The internet is not only about information and transactions – it is about entertainment, too. The last type of bot we're going to look at is all about helping gamers have fun and get the best experience possible, although these bots are also slightly misunderstood. Game bots are AI software algorithms that can interact with a video game and take the place of a human player.

Game bots have lately developed a bad reputation, because gamers in massively multiplayer online role-playing games often abuse them to outsource boring, repetitive tasks such as "farming", or to accumulate experience without the player actually earning it. Used as intended, however, game bots are vital to the gaming experience. In most games, bots are used in development and in demos while players are still learning the ropes, and in multiplayer games they simulate other players at more advanced levels – a useful function when all of your gamer friends are offline!
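As a toy example of such a "fill-in player", the bot below stands in for a human opponent at rock-paper-scissors, countering the move its opponent has favoured so far. The strategy is deliberately simple and purely illustrative.

```python
# Toy fill-in-player bot for rock-paper-scissors. The counter-strategy
# is an illustrative assumption, not how real game AI works.
import random
from collections import Counter

BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

class FillInBot:
    def __init__(self):
        self.seen = Counter()      # history of the human's moves

    def choose(self):
        if not self.seen:
            return random.choice(list(BEATS))
        # Counter the human's most frequent move so far.
        likely = self.seen.most_common(1)[0][0]
        return BEATS[likely]

    def observe(self, human_move):
        self.seen[human_move] += 1

bot = FillInBot()
for human_move in ["rock", "rock", "paper", "rock"]:
    print("bot plays", bot.choose(), "against", human_move)
    bot.observe(human_move)
```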

Just a few decades ago, when the internet was getting started, we could hardly have imagined the size and versatility the world wide web would reach in such a short period of time; the internet nowadays can feel like an abyss that is impossible to navigate – and this is precisely why internet robots were developed. Able to process information at a speed impossible for the human brain, these programmed algorithms now form an integral part of any surfing experience.
