Categories
Business Internet News

Blind Date (More Unauthorized Online Experimenting)

blind-date

Following up from news a couple of weeks ago about Facebook manipulating its users, this week news abounds regarding a dating agency that has been conducting some experiments on its users.

The New York Times reports that the online dating agency OK Cupid has been manipulating the data it gives to its clients, to find out how compatibility and looks affect the dating process. The company conducted three different experiments: in one it hid profile pictures; in another it hid profile text, to see how that affected personality ratings; and in a third it told some hopeful daters that they were a better or worse potential match with someone than the company’s software had actually determined.

So as we might imagine they came up with a series of findings, that we could loosely interpret as the following:

1. If you are told that the person is more compatible you are more likely to contact them.

2. Users are likely to equate “looks” with “personality”, even in profiles that feature attractive photos and little if any substantive profile information.

3. When the site obscured all profile photos one day, users engaged in more meaningful conversations, exchanged more contact details and responded to first messages more often. They got to know each other. But when pictures were reintroduced on the site, many of those conversations stopped cold.

Well, as far as I can see, number 1 is pretty self-evident: if you send me a note saying that a person is not compatible, then I probably won’t bother them with my personal issues. Number 2 is quite interesting: if I like the looks of someone, I am more likely to think that they are an interesting person, maybe fun, and without doubt the perfect match for me. And the third is also fairly obvious: if I don’t know what a person looks like, I might imagine their looks and be more likely to want to get to know them.

The OK Cupid blog will fill you in on the details.

One interesting line from the blog states that “guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work”. Wise words, but I wonder if everybody realizes that. And what power they wield!

Now I would like to raise the issue of how someone can design an algorithm to measure my compatibility with another person. What will make us more compatible? Height? Interests? Worldview (and if so how can you put that into numbers)?
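
Just to make the question concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of thing a matching algorithm could do: score each trait, weight it, and sum. Every trait name and weight below is invented by me for illustration – it bears no relation to whatever OK Cupid actually computes – but it does show how quickly “worldview” gets flattened into a number.

```python
# A toy compatibility score: weighted agreement across a handful of traits.
# All trait names and weights are invented for illustration only.

def compatibility(person_a, person_b, weights):
    """Return a 0-100 score from per-trait answers on a 0-1 scale."""
    total = 0.0
    max_total = 0.0
    for trait, weight in weights.items():
        # 1.0 = identical answers for this trait, 0.0 = opposite answers
        agreement = 1.0 - abs(person_a[trait] - person_b[trait])
        total += weight * agreement
        max_total += weight
    return 100.0 * total / max_total

alice = {"likes_travel": 0.9, "night_owl": 0.2, "wants_children": 1.0}
bob = {"likes_travel": 0.7, "night_owl": 0.8, "wants_children": 1.0}
weights = {"likes_travel": 1.0, "night_owl": 0.5, "wants_children": 3.0}

print(round(compatibility(alice, bob, weights)))  # -> 89
```

The hard part, of course, is everything the sketch glosses over: which traits matter, how to weight them, and whether a worldview can honestly be reduced to a number between 0 and 1.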

There is an interesting book by Hubert Dreyfus called “What Computers Can’t Do”, in which he argues that there are some areas and situations in which computers cannot fully function. A computer program is based on expertise, on experience that can be categorized. If there are subject matters that are impossible to completely formalize, then they are impossible to formalize in computer programs (such as the one used to find my perfect partner, if they exist).

As humans, I think we make decisions based upon generalizations of a situation. We judge characteristics based upon experience: I once knew someone with those characteristics and they were great, or stubborn, or nasty, and so on. Research suggests that we play games such as chess in this way. We do not think through a long series of possible moves in the way a computer does; we see a situation, it reminds us of another situation that we have confronted in the past, and we act according to our experience of acting in similar situations.

I am sure some readers have experience in this field, and I would be very happy to get some comments and expand my understanding.

Categories
Computers Internet Science

The size of the Internet – and the human brain

How many human brains would it take to store the Internet?

Last September I asked: if the human brain were a hard drive, how much data could it hold?

The human hard drive: the brain

I concluded that approximately 300 exabytes (or 300 million terabytes) of data can be stored in the memory of the average person. Interesting stuff, right?

Now that I know how much computer data the human brain can potentially hold, I want to know how many people’s brains would be needed to store the Internet.

To do this I need to know how big the Internet is. That can’t be too hard to find out, right?

It sounds like a simple question, but it’s almost like asking how big is the Universe!

Eric Schmidt

In 2005, Eric Schmidt – then Google’s CEO, now its executive chairman – famously remarked on the size of the Internet:

“A study that was done last year indicated roughly five million terabytes. How much is indexable, searchable today? Current estimate: about 170 terabytes.”

So in 2004, the Internet was estimated to be 5 exabytes (or 5,120,000,000,000,000,000 bytes).

The Journal Science

In early 2011, the journal Science calculated that the amount of data in the world in 2007 was equivalent to around 300 exabytes. That’s a lot of data, and most would have been stored in such a way that it was accessible via the Internet – whether publicly accessible or not.

So in 2007, the average memory capacity of just one person could have stored all the virtual data in the world. Technology has some catching up to do – Mother Nature is walking all over it!

The Impossible Question

In 2013, the size of the Internet is unknown. Without mass global collaboration, I don’t think we will ever know how big it is. The problem is defining what is part of the Internet and what isn’t. Is a business’s intranet which is accessible from external locations (so, in effect, an extranet) part of the Internet? Arguably, yes it is.

A graph of the internet
A map of the known and indexed Internet, developed by Ruslan Enikeev using Alexa rank

I could try to work out how many sites there are, and then multiply this by the average site size. However, what is the average size of a website? YouTube is petabytes in size, whilst my personal website is just kilobytes. How do you average that out?

Part of the graph of the internet
See the red circle? That is pointing at Technology Bloggers! Yes we are on the Internet map.

The Internet is now too big to try to quantify, so I can’t determine its size. My best chance is a rough estimate.

How Big Is The Internet?

What is the size of the Internet in 2013? Or to put it another way, how many bytes is the Internet? Well, if in 2004 Google had indexed around 170 terabytes of an estimated 500 million terabyte net, then it had indexed around 0.00000034% of the web at that time.

On Google’s how search works feature, the company boasts that its index is well over 100,000,000 gigabytes. That’s 100,000 terabytes, or 100 petabytes. Assuming that Google is getting slightly better at finding and indexing things, and has now indexed around 0.000001% of the web (meaning it has indexed roughly three times more of the web, as a percentage, than it had in 2004), then 0.000001% of the web would be 100 petabytes.

100 petabytes times 1,000,000 is equal to 100 zettabytes, meaning 1% of the net is equal to around 100 zettabytes. Multiply 100 zettabytes by 100 and you get 10 yottabytes, which is (by my calculations) equivalent to the size of the web.

So the Internet is 10 yottabytes! Or 10,000,000,000,000 (ten trillion) terabytes.
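
For anyone who wants to check the arithmetic, here is the same extrapolation as a few lines of Python. The only inputs are the figures used above – Google’s roughly 100-petabyte index and my assumption that this represents about 0.000001% of the whole web – so the output is only as good as that assumption.

```python
# Back-of-envelope extrapolation using the figures from this post.
PETABYTE = 10**15
YOTTABYTE = 10**24

google_index_bytes = 100 * PETABYTE    # Google's index: ~100 PB
indexed_share = 0.000001 / 100         # assumed share of the web, 0.000001%

internet_bytes = google_index_bytes / indexed_share
print(round(internet_bytes / YOTTABYTE, 2))  # -> 10.0 (yottabytes)
```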

How Many People Would It Take To Memorise The Internet?

If the web is equivalent to 10 yottabytes (or 10,000,000,000,000,000,000,000,000 bytes) and the memory capacity of a person is 0.0003 yottabytes (0.3 zettabytes), then currently, in 2013, it would take around 33,333 people to store the Internet – in their heads.

A Human Internet

The population of earth is currently 7.09 billion. So if there were a human Internet, whereby all the people on earth were connected, how much data could we all hold?

The calculation: 0.0003 yottabytes x 7,090,000,000 = 2,127,000 yottabytes.

A yottabyte is currently the biggest officially recognised unit of data; the next step up (which isn’t currently recognised) is the brontobyte. So if mankind were to max out its memory, we could store 2,127 brontobytes of data.

By my estimate, the Internet would take up a tiny 0.00047% of humanity’s memory capacity.
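
Here is the same back-of-envelope arithmetic for the last two sections, again in Python, using the figures above (roughly 300 exabytes per brain, a 10-yottabyte Internet and 7.09 billion people):

```python
# Checking the human-storage figures above.
EXABYTE = 10**18
YOTTABYTE = 10**24

brain_bytes = 300 * EXABYTE            # ~0.0003 yottabytes per person
internet_bytes = 10 * YOTTABYTE        # the estimate from this post
population = 7_090_000_000

people_needed = internet_bytes / brain_bytes
humanity_bytes = brain_bytes * population
internet_share = 100 * internet_bytes / humanity_bytes

print(round(people_needed))            # -> 33333 people
print(humanity_bytes // YOTTABYTE)     # -> 2127000 yottabytes
print(round(internet_share, 5))        # -> 0.00047 (per cent)
```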

The conclusion of my post on how much data the human brain can hold was that we won’t ever be able to technically match the amazing feats that nature has achieved. Have I changed my mind? Not really, no.

Categories
Computers Internet

File Sharing: Is Your Business Getting the Most Out of It?

Has your business ever reached a point where it’s run out of spare capacity for important files on its hard drives? If so, you might wonder what you could do to make sure it never happens again. The same might go for moving large files around, which can be fiddly at the best of times when email accounts cannot cope.

More room

A business can never have enough spare capacity for files. A growing number of businesses throughout the world have turned to cloud computing to help with both storage and sharing, for a variety of reasons, which include:
Data servers

  • Being able to store files online in a ‘cloud’, an online space where they can be accessed securely.
  • Being able to share files from the cloud with clients and colleagues – collaboration is also possible.
  • Providing a viable alternative to a traditional server, which is far more cost-effective.

While all this helps to make using the cloud palatable, there is a possibility that businesses aren’t getting what they expect from some cloud storage providers, and that’s where enterprise cloud computing services like Egnyte come in.

Value for money

As with everything else they buy, businesses should make sure they get the most from their cloud storage and online file sharing package. There are a number of pitfalls facing companies who turn to the cloud for some of their IT solutions which they would do well to avoid.

The main one is the limits placed on the amount of file space you have to work with and the size of files which you can share. Many providers have limits in place, so it’s important to get as much space for as little money as possible. Also, consider what your business needs – how big are the files which you share and how much space do your files take up?

As this article states, the way in which your file sharing vendor affects your internet connection is also important. If you have several file sharing accounts, they could slow your internet speed down, so take that into consideration before choosing the right service provider.

Safety first

Another factor that should influence choice of a cloud file sharing provider is the security of their services. Most providers have security software which limits opportunities for accounts to get hacked, while a few have taken extra steps to make users’ accounts practically impervious to even the most sophisticated malware.

Categories
Social Media

What Not To Share on Social Media

The point of social media is sharing, along with openness and at least trying to be yourself over the internet. While there are a lot of things worth sharing and airing to the world, there are some things that are best unsaid – or in this case un-tweeted, un-Facebooked, and just kept to yourself.

Photos of credit cards or other financials

You might be thinking “nobody is stupid enough to do that,” but the truth is, there are people who have already done it. Some people have posted photos of their credit cards – account numbers and all – leading to some nasty comments. Clearly, this is not a wise thing to do. Others post photos of bills, leaving their names and addresses unblurred. This is a big risk that can easily be avoided. You are also undermining a section of Facebook’s Community Standards, which states:

“We take the safety of our members seriously and work to prevent attempts to compromise their privacy or security, including those that use fraud or deception. Additionally, we ask that you respect our members by not contacting them for commercial purposes without their consent.”

Pranks

If you post a link that is seemingly interesting, make sure it really does lead to a worthwhile page; otherwise, you are just wasting people’s time. Rickrolling – linking people to a YouTube video of Rick Astley singing “Never Gonna Give You Up” – was once very popular, but is now just an annoyance. Show some maturity. It may also be in violation of this section from Facebook:

“Before sharing content on Facebook, please be sure you have the right to do so. We ask that you respect copyrights, trademarks, and other legal rights.”

As well as Twitter, from their Twitter Rules:

“Copyright: We will respond to clear and complete notices of alleged copyright infringement. Our copyright procedures are set forth in the Terms of Service.”

It has been said that Rick Astley asked for the video to be taken down. So when you Rickroll, you may be committing a violation.

Vague updates

If you are being vague, you are most likely asking people for attention. You want them to ask you what it is about but the truth is, nobody really cares about your vague status updates.

Crass photographs – of yourself, no less

We are not all blessed with bodies of Greek gods and goddesses so it might be in everyone’s best interests to avoid uploading that self-portrait you took when you were fresh out of the shower. Besides, the terms of Facebook say that:

“Facebook has a strict policy against the sharing of pornographic content and any explicitly sexual content where a minor is involved. We also impose limitations on the display of nudity. We aspire to respect people’s right to share content of personal importance, whether those are photos of a sculpture like Michelangelo’s David or family photos of a child breastfeeding.”

Your contact details or anyone else’s

Your phone number is a very sacred thing that should only be given out to people you know and trust. There are lots of people on the internet that will take great pleasure in making your life miserable if you happen to post your contact details on any social media websites.

Social media privacy

Your address, photos of your home, and vacation dates

These are all a combination of ways to say “I will be gone on these days but hey, look where I live and see the nice things that will be left unattended”, which, in a nutshell, is an open invitation for people with less than noble intentions.

Threats and bullying

There is nothing worse than a bully who does their dirty work online. It is also a clear violation of Twitter and Facebook’s policies:

“Safety is Facebook’s top priority. We remove content and may escalate to law enforcement when we perceive a genuine risk of physical harm, or a direct threat to public safety. You may not credibly threaten others, or organize acts of real-world violence. Organizations with a record of terrorist or violent criminal activity are not allowed to maintain a presence on our site. We also prohibit promoting, planning or celebrating any of your actions if they have, or could, result in financial harm to others, including theft and vandalism.”

“Facebook does not tolerate bullying or harassment. We allow users to speak freely on matters and people of public interest, but take action on all reports of abusive behavior directed at private individuals. Repeatedly targeting other users with unwanted friend requests or messages is a form of harassment.”

“Facebook does not permit hate speech, but distinguishes between serious and humorous speech. While we encourage you to challenge ideas, institutions, events, and practices, we do not permit individuals or groups to attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.”

“Violence and Threats: You may not publish or post direct, specific threats of violence against others.”

Rules are there for a reason and following them will make social media sites a better place for everyone involved.

Categories
News Technology

2012 Proves to be Award-Winning Year for Oracle

This is a sponsored post. To find out more about sponsored content on Technology Bloggers, please visit our Privacy Policy.

This year is off to a good start for Oracle, as the company has won several prestigious awards which put it at the top of its game. Can the business software provider continue its success throughout 2013?

Oracle’s Logo

V3 readers voted Oracle Best Business Intelligence Vendor at the V3 Technology Awards 2012. After the award, an interview with Nick Whitehead, senior director for business intelligence (BI) at Oracle UK, highlighted some relevant areas for Oracle and IT professionals as a whole regarding how 2013 looks for the ICT industry. Mr Whitehead discussed the systems developed by Oracle in the past year and how BI is affecting the decisions being made now and for the future.

Mr Whitehead highlighted that 90% of the world’s data has been created since the beginning of 2011, with expectations for this data volume to multiply 50 times by 2020. He emphasised this by adding:

“Often it is talked about as a problem for IT – how do you capture and store all that new information? There’s lots of it, it’s arriving fast and there’s lots of variety. Our customers are starting to ask ‘how do we get business value from all that data we’ve never captured or analysed before.’ I think that’s a better question. Value is realised with analytics. We want to help our customers find answers with business analytics. In every industry we’re seeing answers emerging, new business models where using all this new data is imperative. Once the business case is clear and understood, we can help them solve the IT problem with good architecture and engineered systems to allow them to acquire and organise it in a cost effective way”

Advancements like these help explain why IT has continued to be a thriving sector compared with other business sectors across the globe. The technological possibilities in answering these questions also mean that, for those in Oracle jobs, the forthcoming year is set to be an interesting and rewarding time.

Oracle’s success continued when it was recognised as the global market leader in customer care systems, and in overall market share by revenue. Leading industry analyst firm Analysys Mason published the report in January 2013, based on 2011 shares.

Mark Mortensen, principal analyst at Analysys Mason, commented on the announcement saying:

“Customer care systems are increasingly important to communications service providers as they work to gain competitive advantage and reduce customer churn. ‘All-in-one’ CRM systems, such as Oracle’s offering, help CSPs reduce costs and grow revenue by enhancing self-service channels, improving cross-channel sales and marketing efforts and improving business analytics”

Yet another success came with Oracle being placed in the Leaders Quadrant by Gartner Inc in its 2012 “Magic Quadrant for Integrated Revenue and Customer Management for CSPs” report. This is based on the high level of vision and ability within the company.

Widely recognised as the leader in the field, Oracle seems to be going from strength to strength. Will 2013 be another award-winning year for Oracle and its workers?

Categories
Computers Environment Internet Media

Data Storage Problems

This week the New York Times published a long article about the problem of data storage, and I would like to summarize some of its findings. The article is available here, in Saturday’s technology section.

The article is an attack on what the author sees as the wasteful use of resources in data storage centres. There are now hundreds of thousands of these huge centres spread throughout the world, and the problem is that they use an incredible amount of electricity. The servers have to be kept cool, and they have to have spare capacity so that we can download whatever we want, whenever we want.

Inside a US data centre
Inside a US data centre

Worldwide, these centres use about 30 billion watts of electricity – roughly the output of 30 nuclear power plants. A single data centre uses about the same amount as a small town, and the main criticism is the nature of the usage.

In the US, 2% of all electricity used goes to these data centres, but the vast majority of this resource is wasted. Typically, many servers are left running 24 hours a day but are never or rarely used (more than half in this study), and the average machine in operation uses less than 10% of its capacity. Servers are left running obsolete programs, or left ‘comatose’, because nobody wants to risk a mistake by turning them off.

All of this means that any data center might use 30 times as much electricity as is needed to carry out the functions it performs.

All of these centres also have to have a backup in case of power failure, and so are surrounded by diesel generators and stacks of batteries; many have been found in breach of environmental regulations and fined. The article gives details, but the companies are names that we all know and use.

If you read the more than 300 comments, however, you will discover that a lot of people do not agree with the findings as reported. Many technicians argue that the companies cited are investing huge amounts of money in making the storage of data more efficient, and are constructing wind farms and using solar power in an attempt to cut costs and emissions. The article has its agenda and exploits it fully, but the problem is real.

I personally believe that we are witnessing the results of a digital culture change. We no longer have to store data on our machines; we can store it in some mythical cloud out there in the cyber-universe. This makes us think that it somehow exists without the need for a hard drive, but this is not true. As a result we keep things that we do not need. I have 500 emails in my inbox, with attachments, photos that I will never look at again and other useless things, and they are all in storage somewhere.

Technology advances, storage gets cheaper and uses less space, but the amount of data created is growing at an incredible rate. My question is, can we do anything about it? Are we not the ones who should take some responsibility and think about the consequences of our actions? We think about not using paper to print emails, but we don’t think about not sending them!

Categories
Computers Internet

Data centers – where would we be without them?

It is hardly a controversial opinion to suggest that data centers are an indispensable part of modern life – it really is impossible to envisage a world without the vast capacity and power that data centers grant the business and information technology sectors – where would we all be without Google, eBay and Facebook? But what exactly are the advantages of a data center that their absence would remove? Let’s take a look…

Firstly, the cost to enterprises of running an in-house data center is increasingly prohibitive, given the growing size requirements of such facilities. As more and more information is transmitted online and more transactions take place online – just as computing power continues to grow – a data center needs exponentially more power and more hard disk space. Server racks just keep on getting longer, and even keeping them cool enough becomes uneconomically costly for most small, medium or even large-sized enterprises.

Server Room

Security is the second big issue – in a world where cyber-crime and espionage are constantly growing, there needs to be a very robust response to security. Most enterprises just don’t have the time and resources to police their IT operations to the level required, and a good data center can give them the reassurances they need. Technicians can be on hand throughout the 24-hour cycle, not just to ensure that the server racks keep functioning perfectly and that any problems are swiftly dealt with, but also to ensure that information is kept safe and secure and that hacking or phishing attempts are roundly thwarted.

So, where would we be without data centers? The answer is that we would have a vastly more limited cyber-world, with businesses forced to keep their computer operations artificially scaled back due to cost considerations, with the knock-on effect of a hobbled e-commerce sector. Social networks would be slow, unsafe and prone to disastrous infiltration, while search engines would also be grindingly slow and frustrating to use – welcome back to the late 1990s.

It’s safe to say that data centers are not only here to stay, but will keep getting bigger and better so long as the computer world leads the way.

Categories
Computers Media Technology

How the USB revolutionised computing

This is a sponsored post. To find out more about sponsored content on Technology Bloggers, please visit our Privacy Policy.

The Universal Serial Bus, or as it is now commonly known, the USB, is a port designed to provide a power supply to, or share data between, electronic devices.

Ask someone to think about a USB and most people will naturally assume you are talking about a memory stick, which in essence is a super small, lightweight portable hard disk. However, don’t confuse a USB port (the socket on the host device) with a USB flash drive (a memory stick).

The USB (both port and flash drive) is something most of us take for granted in modern times, so I thought in this post it would be interesting to look at some of the uses for USBs, and how the USB has evolved over time.

USB 1.0

Design prototyping for the USB began when computing was still in its infancy, way back 18 years ago in 1994. At the time, the port was being developed by the big players in the computing industry – Microsoft, Compaq, Intel, IBM and others. These companies realised that there was (at the time) no easy, standard way for computers and other devices to communicate with each other. For the computer to evolve, the companies realised that this would be an integral part of the system, because if you cannot share data, your options are limited. Do remember this was happening before the internet was the global phenomenon it is today.

The first USB hardware was produced in 1995 by Intel. Computers of the time started to come fitted with one or two USB 1.0 ports – although, looking back, relatively few PCs were ever released with USB 1.0 ports. Nowadays, USB ports are in most cases a necessity, if only for keyboard and mouse input devices.

The USB 1.0 was a revolutionary product; however, looking back, its functionality was limited. Its maximum data transfer speed was 12 megabits per second – relatively slow. That said, back when it was first introduced, a computer’s internal hard drive was typically only around 256–1024 MB (a quarter of a gigabyte to one gigabyte) in size.

USB 2.0

In late 2000, the USB flash drive was released, enabling users to store more data than ever before, by storing things externally to their computer. It would be an understatement to say that the USB flash drive was a step up from the floppy disk – it was more of a leap! Initially, most flash drives were typically 8 megabytes in size, meaning that they could hold more than five times what a floppy disk could.

Earlier that year, USB 2.0 had been released, meaning that data transfer could happen 40 times faster, at 480 megabits per second. Initially some flash drives were designed for 1.0, but soon they were all being designed for the new 2.0 port, due to the increased possibilities.

USB 3.0

In 2008, the currently less well known USB 3.0 was released, which is more than ten times faster than its 2.0 brother.

USB flash drives have also improved over the years, and it is now possible to get a USB flash drive that is 256 gigabytes – one quarter of a terabyte. These disks are bigger than most computer hard drives were just a few years ago, showing the extent of the upgrades this technology has undergone.

A 256 gigabyte memory stick would, though, be almost useless with a USB 1.0 port, as filling it would take almost 2 days (1.98 days) due to the speed of the data transfer. Even with a USB 2.0 port, the transfer would take almost 72 minutes – more than an hour. Modern USB 3.0 ports could have the job done in less than 7 minutes. That really shows the true scale of the achievement and advancement made in the USB industry.
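
As a sanity check, here are those transfer times worked out in Python. I am assuming the nominal signalling rates of each generation – 12 Mbit/s, 480 Mbit/s and 5 Gbit/s for USB 3.0 – so these are best-case figures; real-world throughput is lower.

```python
# Time to fill a 256 GB flash drive at each USB generation's nominal rate.
drive_bits = 256 * 8 * 10**9           # 256 GB expressed in bits

rates_bits_per_second = {
    "USB 1.0": 12 * 10**6,             # 12 Mbit/s
    "USB 2.0": 480 * 10**6,            # 480 Mbit/s
    "USB 3.0": 5 * 10**9,              # 5 Gbit/s (assumed nominal rate)
}

for name, rate in rates_bits_per_second.items():
    seconds = drive_bits / rate
    print(f"{name}: {seconds / 60:.1f} minutes ({seconds / 86400:.2f} days)")

# USB 1.0: 2844.4 minutes (1.98 days)
# USB 2.0: 71.1 minutes (0.05 days)
# USB 3.0: 6.8 minutes (0.00 days)
```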

Modern Uses

The USB is a crucial component of the modern PC, and is also very important for other devices. It is now possible to power many smartphones and multimedia devices via USB, either through a plug or your computer.

Some people use USB sticks to carry a portable operating system around with them, as it is perfectly possible to load Windows 7 onto a 16 GB memory stick and take it wherever you go.

A USB pen

The USB itself is a very flexible (not literally – the board would probably snap were you to bend it) device, with a lot of room for aesthetic variation. You can now get a range of Promotional USB Sticks, which many organisations utilise, choosing to offer branded USBs as promotional gifts. This is all thanks to the readily available technology and the cheap price of the components involved.

USBs now come shaped as credit cards, keys, pens, robots, people and even wine bottles!

A USB shaped into a bottle

Do you have any funky USB flash drives at home? How about USB ports – have you counted how many your PC has? Comments and feedback below as always 🙂

Categories
Business

Is remote working the future for business?

In previous articles I have talked about how technology and business interact, and what the future of technology might hold for the world of business. In this article I am going to explore the idea that in the future, almost all business will be conducted remotely.

The number of people who permanently work at home in the UK (known as teleworking) was estimated to be 1.3 million in 2010. The working population of the UK at the same time was believed to be around 30 million, so around 4.3% of the UK’s working population (in 2010) were teleworkers. That said, it is estimated that at the same time 3.7 million UK workers sometimes worked from home and sometimes went into their place of work. That means that, of the working population, around 12.5% were, at some point, working remotely.

The figures are similar for the USA and other developed nations. More and more firms around the world are offering their employees the opportunity of working from home, but why?

Cost Advantages

Many people do not realise it, but it is often much cheaper to give employees access to the technology they need to work at home than it is to provide them with a workstation in an office unit. Yes, that might mean you need to buy every employee a laptop and printer and make sure they have an internet connection, however that is often much cheaper than maintaining a workstation in a fixed location.

If employees work in an office, then the firm either has to purchase or rent the premises – this can be very costly. Furthermore, an (often very expensive) central IT system needs to be in place to ensure that the entire building is connected internally and with the outside world – including offices in other locations. Most employees will need a computer to work at, so why not buy them a laptop, give them their own printer, make sure they are connected to the internet, and tell them to work from home? It is often much cheaper.

Technological Advancements

Improvements in technology mean that working from home is more viable than ever before. Thanks to online storage systems, which allow simple, easy and effective file sharing among workers, employees are able to connect with each other and share data from almost anywhere in the world. Outsourcing such tasks is often a much cheaper option for firms than maintaining their own expensive IT infrastructure.

Advancements in communication technologies have also improved the viability of teleworking. I have previously written about QB Robots – robots which are effectively your eyes and ears in the office, and which you can control remotely whilst you are not there. These sorts of devices mean that you can still connect with other workers, almost as if you were in the room with them.

Anybots QB Robot
The head of one of Anybots QB robots – notice the webcam eyes and screen inbuilt into the head – such robots can improve the potential for remote working

You don’t necessarily need a QB robot to stay in communication with others though. Technologies like webcams and VoIP mean that it is really easy to stay in contact and in the loop, so you are just as up to date as you would be were you in the office.

Service Improvement Through Better Access

Technology has made it easier to work remotely, and it is often cheaper, but another advantage of teleworking – and a reason which I believe will drive further growth – is the improvement in accessibility that it offers.

In his recent article ‘Five changes in video conferencing for the next decade‘, Rashed wrote about how improvements in connectivity could improve the prospects for services like telemedicine. Being able to connect to people remotely means that those who live or work in more remote areas are more likely to be able to become connected.

Improvements in Productivity

Many studies have shown that working from home can actually boost productivity and reduce the time employees take off ill.

British Telecom claims that its teleworkers save it an average of £6,000 per year (per worker), due to the reduction in the cost of providing a workstation, the reduction in commuting costs, and increases in productivity. BT claims that its teleworkers are 20% more productive and take fewer sick days. This is probably due to the reduced stress associated with working at home, as employees do not need to deal with the hassles of commuting, or the occasional hassles presented by co-workers – arguments and misunderstandings can cause stress!

In addition to this, the less time employees spend commuting, the more time they have to themselves, and the more time they can spend working. Say an employee spends an hour and a half commuting each day (two 45-minute journeys); working from home, they could get an extra 45 minutes of work done and still gain an extra 45 minutes for themselves.

In Summary

To conclude, working remotely is often a much cheaper option for both firms and employees; it has been made more viable thanks to technological improvements; it can improve the services that a firm can offer; and it can also improve the productivity of the workforce. These are some of the reasons why I believe teleworking will become much more common in the future.

Categories
Business

How important is the quality of hosting to online retailers?

This is a sponsored post. To find out more about sponsored content on Technology Bloggers, please visit our Privacy Policy.

With many brick-and-mortar businesses adding an online version of their high-street store to their portfolio, it’s important that firms choose the right web hosting service. With myriad services offering cheap deals, firms ought to be wary of offers that appear to be too good to be true – because they usually are.

On the surface, purchasing web hosting that costs £20 a month seems like a steal. In fact, it is a steal – but the only thing being nicked is precious uptime for online retailers, as the vast majority of cheap hosts go hand-in-hand with downtime.

Downtime – a retailer’s worst nightmare

For online retailers, downtime is especially damaging; every second of downtime is potentially a lost sale. Would you rather pay a premium for quality web hosting that is reliable and constantly up, or pay a third of the price for web hosting that keeps going down? In the long term, it may cost firms more to pay for cheap but less reliable hosting.

In addition, utilising the services of a web host in your own time zone could be important, especially for smaller firms. Imagine if your store goes down but your hosting provider is half-way across the world. This is certainly not ideal for any store looking to make sales. For example, say your UK-based store goes down at lunchtime: no amount of calls at 12pm is going to wake a firm located half-way across the world, tucked up in bed at 12am. It’s a nightmare scenario.

SEO

Over the last 10 years, the growing importance of SEO has changed the face of the internet. An increasing number of online retailers are producing fresh content in a bid to become a publishing authority in the eyes of search engines.

However, when a cheap hosting company serves up dead links, 404 errors and other harmful downtime to a retailer, what are the search engines going to think? Bounced traffic isn’t going to look good in the eyes of Google or Bing.

Technology Bloggers 404 error
Technology Bloggers 404 Page

Eventually, a retailer could slip down the rankings, and get flanked by its competition. It takes a lot of dedication to work up the search rankings, so don’t let a bogus hosting firm ruin your company and its prospects.

Security

In addition, security should be a top priority for online retailers. The number of hackers roaming cyberspace is vast and, make no mistake, they’re ready to capitalise on unprotected websites. By opting for secure web hosting which features SFTP and SSL, a business and its clients can feel assured that all sensitive data is kept in safe hands.

As you can see, the quality of web hosting is an absolutely integral part of the foundations of success for online retailers. In an era of cost cutting and tight purse strings, it might be tempting to plump for a cheap web hosting service from the other side of the world, but in the long term you may end up opening your wallet more often than you think.