Greenpixie provides cloud emissions data to help companies take control of their carbon footprint, reduce spending, and lower their emissions.
We spoke to the CEO of the company, former Infotex employee John Ridd, to find out a bit more about his journey, cloud computing emissions, and Greenpixie’s mission to help reduce them.
It was about two years ago, at ideation, when my co-founder Will, the company's CTO, and I had the idea during a hackathon. We had heard about digital emissions making up over 2% of global emissions – more than the entire aviation industry.
We felt that there was a real business opportunity to quantify this issue and perhaps reduce it using the scalability of digital and the availability of data for this particular sustainability issue.
We created an MVP (Minimum Viable Product) and first focused on websites: we came up with a way to quickly estimate the emissions of a website homepage, which we sent out via email marketing and got a lot of responses – we thought, we're onto something here! That started our journey into digital sustainability. It just so happened we didn't end up doing websites but cloud computing emissions; however, the principles we started with remained the same: a scalable way to measure and reduce digital emissions.
By 2030, digital emissions are predicted to be over 10% of global emissions due to continuing demand for data. A large share of these digital emissions comes from cloud computing, but also the internet itself; ChatGPT, streaming services – it all ends up with data centres, which suck up an incredible amount of electricity. A lot of metals and rare minerals are used to create all this server hardware and continually replenish it – I think the average lifespan of a server in a cloud computing data centre is only about four years. A recent Telegraph article refers to data centres outside London as 'energy vampires' because their electricity usage is impacting residential build plans.
It's a combination of those two things: the huge amount of electricity used, and the materials and minerals needed. There's also a third dimension, which is water usage. Data centres take in a colossal amount of water to keep things cooled. One example, covered a couple of months ago in BBC Panorama's Is the Cloud Killing the Planet?, is a data centre in Nevada that used one billion litres of water a year.
That gives you an idea of how our digital world and cloud computing actually cause a very big sustainability issue.
This is really a business and enterprise problem. It's not going to change from us at home watching less Netflix or reducing our own digital emissions, because that's a drop in the ocean compared to where the problem lies: with businesses and enterprises. So we focus on cloud computing.
Software engineers in companies essentially rent servers from data centres owned by Amazon, Google, and Microsoft, through services such as AWS and Microsoft Azure. Hundreds or even thousands of tonnes of emissions are created from renting all these servers, yet there's currently no real way for companies to get reliable cloud computing emissions data. Cloud providers are not transparent enough to provide this information in a way that is genuinely compliant with regulation, especially given the multi-cloud setups within companies.
Secondly – the big factor, which we focus on – there's a large amount of cloud waste in the way enterprises utilise the cloud. Software engineers over-purchase server space when they are building out the tech stack for companies. This means around 30% of the servers being turned on, left on, and used don't need to be. So there's 30% more emissions, which also correlates with a 30% overspend that companies are paying.
What Greenpixie does is provide companies with the data needed to comply with the latest regulations. One of the big ones is the CSRD coming through the EU, which makes companies responsible for their cloud computing emissions; these fall under Scope 3 (the supply chain) in the ESG framework. Secondly, we enable software engineers to reduce emissions at source: we quantify the emissions being wasted, and they can then clean up the waste through the way they build out their tech, saving money as they do.
I would say when it comes to digital emissions, they’re very actionable and very measurable when the right tools are out there. I’m actually very optimistic that we can reduce this ‘10% or more’ figure by 2030.
The truth is, doing it for the good of the planet normally doesn't fly when it comes to business priorities, because everyone has so many other priorities. Leadership teams need their sustainability initiatives to succeed at reducing emissions at source, so there need to be cost incentives, which we provide. We're optimistic that this approach is going to be adopted and that a lot of emissions are going to be prevented.
Through a wider sustainability lens, there's been a move away from offsetting, because in practice that is companies outsourcing their responsibility. There's some high-profile greenwashing legislation coming through, which is going to stop companies getting away with that. As long as companies are clever in incentivising enterprises to reduce their emissions, we can get there. Cloud is a really good example of that, I think.
We covered cloud emissions at COP 27 for the first time. And we have a really amazing advisory board that opens up opportunities like this.
There is a responsibility when it comes to putting out the best information, but there's a real network of businesses in industry sustainability at this point. There are institutions, such as the SDIA (Sustainable Digital Infrastructure Alliance), focusing a lot on the science behind this. There is also open source tooling, Cloud Carbon Footprint, which we've built upon and improved to be able to give emissions data. We're connected into all this information, which allows us to be leaders in the space. It's also become more mainstream now – on BBC One a couple of months ago, cloud computing emissions were covered as a top-level issue.
There is awareness now, but we were one of the first to build a business around this.
We’re fundraising in order to bring this to market in a very big way, and we’ve already got this data product that brings transparency over the issue and allows software engineers to start acting on it.
So really we want to go down the route of growing the team. We've got six full-time currently, but we want to get a lot bigger than that now we've brought a product to scale. Hopefully we can be servicing hundreds of companies' cloud emissions in the near future, making enough revenue ourselves and riding the wave of what we've built.
Yeah, I have a lot to be thankful for from my time at Infotex. I actually did an interview fairly recently where I spoke about Infotex and about Tim Webster (My First Boss).
Infotex is a very giving company that enables companies like Greenpixie to emerge because of people like Tim. They're generally very generous with their time and have a tech mindset that allows companies like this to be built.
The General Data Protection Regulation came into effect on 25th May 2018. Its goal is to protect the rights of individuals where personal data is being used. It does this by outlining the rights of individuals, requiring a lawful basis for the processing of personal data, and placing expectations on how personal data is managed.
In practice, it can be time-consuming and difficult to adhere to GDPR and it is not always clear how specific scenarios should be interpreted. This sometimes creates a laissez-faire attitude towards it where it can be treated as a box-ticking exercise.
At its heart, though, GDPR, when implemented correctly, benefits us all. Fundamentally it is about understanding the breadth of the personal data you are capturing and reviewing its journey from the individual through your business-controlled systems and processes – and, oftentimes, its onward journey to third parties.
Without this process, it’s very easy to fall into a situation where you are capturing personal data without even realising it. Keep in mind that GDPR takes a fairly broad approach to what constitutes personal data (including IP addresses and cookies where they can be utilised to identify an individual). Producing a modern website is complex and this complexity increases all the time. Whilst sometimes functionality is programmed from scratch for a specific website, there are also numerous choices when determining which 3rd party services to utilise or integrate with. Take something simple like an integration with a page-sharing service. Perhaps it utilises some embeddable code to render the icons and facilitate sharing. Innocent enough on the surface but is the script capturing any data? Is any of the data personal? Where is it sent? What happens to that data? How do we request its access/deletion?
Another common example is a contact form capturing a simple message (perhaps a name, contact number, and message). This seems relatively straightforward, but there are a number of questions we should be asking, such as: do we need all of that data? Where is the message being sent? Is it stored anywhere? Does it get sent via email to an email client? Is that being downloaded and stored? Is it passed on to other departments within your company, or does it travel onward to other 3rd party systems? Would you be able to recover/delete that data if requested?
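To make those questions concrete, here is a minimal sketch of a contact-message store designed with them in mind. All names here are hypothetical, not a real Infotex system: the point is that capturing only what you need, keeping it in one known place, and supporting export, deletion, and retention makes access and erasure requests answerable.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class ContactMessage:
    # Capture only the fields the form actually needs - nothing extra.
    name: str
    contact_number: str
    message: str
    received: datetime = field(default_factory=datetime.utcnow)


class ContactStore:
    """Holds messages in one known place, so access/deletion requests can be honoured."""

    def __init__(self, retention_days: int = 90):
        self.retention = timedelta(days=retention_days)
        self._messages = []

    def add(self, msg: ContactMessage) -> None:
        self._messages.append(msg)

    def export_for(self, name: str):
        # Subject access request: return everything held about this person.
        return [m for m in self._messages if m.name == name]

    def delete_for(self, name: str) -> int:
        # Right to erasure: remove everything held about this person.
        before = len(self._messages)
        self._messages = [m for m in self._messages if m.name != name]
        return before - len(self._messages)

    def purge_expired(self, now=None) -> int:
        # Retention policy: don't keep personal data longer than needed.
        now = now or datetime.utcnow()
        before = len(self._messages)
        self._messages = [m for m in self._messages if now - m.received < self.retention]
        return before - len(self._messages)
```

In a real system the store would be a database and deletion would have to reach onward copies (email inboxes, CRM systems, backups) too – which is exactly why the questions above matter.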
It’s important to state that GDPR does not stop you from doing these things. It asks you to consider whether you need to do them and, if you do, that you do so responsibly and transparently without infringing on the rights of individuals.
From a business perspective, there is great value in understanding your data.
We owe it to ourselves to carefully consider how we capture, process, and share personal data. We shouldn’t just implement a new service, integration, or tool without first looking at it through a data protection lens. A useful exercise is to consider whether you would be comfortable with your own personal data being processed in that way.
As a final thought, we are entering a world where AI is going to be a part of our everyday lives. AI systems require input in order to respond (either in the form of a question or, as another example, the context around something being analysed such as a piece of code). This input has the very real possibility to contain personal or sensitive data. Where does that personal data go? Who is it shared with? Can it easily be recalled/deleted? These questions do not have easy answers and it remains to be seen how AI will be regulated to provide the same protections currently offered by GDPR.
If you need help understanding your data please get in touch.
For those of us who live in the Microsoft world, .NET has been revolutionary over the last 20 years.
It enables developers to collaborate easily and write, test, and fix code efficiently, and it supports builds for a range of different applications, including desktop apps, mobile apps, and web apps.
Our .NET applications are developed to sit behind the websites, processing data, integrating with other platforms and providing back office user functionality that is so important to make websites work effectively on the front end.
Microsoft .NET is an open source developer platform that assists the creation of various types of applications.
.NET has two main versions:
.NET Framework is the original version that Microsoft introduced back in 2002. It was a replacement for Visual Basic, which was used by many developers to build applications for Windows. Around this time was when Internet technology was really taking off, with many organisations building not only websites but Intranets as well.
Alongside .NET, Microsoft also introduced Visual Studio .NET, which was an Integrated Development Environment (IDE). This enabled developers to develop software code, build this code, and ultimately run it on their computers to debug before putting it live. This IDE software is updated every couple of years.
The last version of .NET Framework, 4.8, was released in April 2019, with a security update, 4.8.1, following in August 2022.
Microsoft has now stopped major development of the .NET Framework and moved across to what was known as .NET Core before it became .NET 5 in November 2020. Through this cross-platform version, Microsoft has expanded its platform to support .NET developers looking to leverage other platforms, and to attract customers looking to build apps with other tools including Node, PHP, and Java.
As with most things Microsoft, there are continual improvements and new versions being released each year. .NET 7 was released in November 2022 with .NET 8 due to release in November 2023.
What this means for us, as developers, is a continuous learning process to keep our knowledge and our software up to date, as unfortunately Microsoft doesn't support each .NET version for very long – .NET 7 will stop being supported in May 2024.
For our customers, it means that we have to focus on keeping applications up to date, and occasionally having to rewrite these from scratch to be able to utilise new functionality and keep applications secure.
Microsoft Azure is a cloud computing platform which launched in February 2010. It is a collection of integrated services for building, deploying and hosting applications and services through a global network of Microsoft managed data centres.
The idea behind cloud computing is that it stops organisations having to have their own data centres or collection of physical servers, which both need to be managed and are very energy inefficient. For example, data centres have to be air conditioned so that the servers don’t overheat.
It is the large-scale equivalent of your Google Drive or Microsoft OneDrive, which stores your files on big servers that you access via an internet connection, instead of on your own computer’s hard drive.
Cloud software runs on a remote server belonging to the company who makes or operates that software, and when you want to use it you access your account online.
During the 1980s – 2000s, Microsoft’s Windows system was the go-to operating system, enabling home PC users and businesses alike to interact with their computers. But with the cloud computing revolution of the late 2000s, competitors like Amazon Web Services (launched in 2006) introduced online services for developers to make new websites and complex applications from one basic framework.
To avoid being left behind, Microsoft launched Azure, a cloud platform for .NET and other developers to interact with. Microsoft has now opened up the Azure environment, adding support for non-Microsoft technologies in order to widen its appeal to all kinds of developers. Microsoft has also built a large number of technologies specifically for the Azure platform.
Compared to on-premises and some traditional hosting providers, Microsoft Azure can help increase efficiency and reduce costs. It is reliable, offering platform uptime guarantees of 99.95% and can be coupled with multi-region failover to further increase reliability.
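As a rough worked example (a sketch; figures rounded), a 99.95% uptime guarantee can be translated into the amount of downtime it actually permits:

```python
def allowed_downtime_minutes(uptime_pct: float, period_hours: float) -> float:
    """Minutes of downtime permitted in a period at a given uptime percentage."""
    return period_hours * 60 * (1 - uptime_pct / 100)


# 99.95% over a 30-day month: 43,200 minutes * 0.0005 = 21.6 minutes
monthly = allowed_downtime_minutes(99.95, 30 * 24)

# ...and about 4.4 hours over a full year
yearly_hours = allowed_downtime_minutes(99.95, 365 * 24) / 60

print(f"{monthly:.1f} minutes/month, {yearly_hours:.2f} hours/year")
```

Multi-region failover matters because it addresses the remaining fraction: even a small percentage of a year is a meaningful outage window for a business-critical application.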
Microsoft Azure offers a variety of services, including virtual machines, databases, storage, networking, analytics, artificial intelligence, and Internet of Things (IoT). Infotex use many of the Azure services as part of our technical toolkit. For example, "WebApps" is used to power some of our web applications, coupled with Azure's cloud-based SQL Server database.
If you would like to discuss what would be most suitable for your needs, speak to one of our experts today.
We talk about security a lot in our articles, but we talk about it even more internally, as it's vital we maintain safe and secure sites for our clients.
The threat intelligence team at WordPress security experts Wordfence has recently released its annual report on the state of WordPress security. As hosts of many WordPress sites, we have to understand the ever-changing landscape in which these sites exist so we can combat likely intrusion points.
The key take-aways from this year's report are discussed below.
That sounds bad – more vulnerabilities means more problems? Not quite. There has been an increase in the number of companies that are CVE Numbering Authorities. CVE stands for "Common Vulnerabilities and Exposures", a publicly available catalogue of known security flaws. Historically, many WordPress issues were not reported, but the increase in the number and openness of these authorities has made it much simpler for people to officially disclose security problems.
As WordPress, and the majority of its plugins, are built within the open source ecosystem, anyone can download the code and analyse it. The more people who are looking, the more issues are likely to be found. Finding and reporting such issues is increasingly becoming a full-time occupation for many developers, who are paid through "bug bounty" programs. These ensure that bugs don't end up in the hands of malicious entities, and this responsible reporting helps everyone within the ecosystem.
Of the vulnerabilities reported, the most common issue was Cross-Site Scripting (XSS): with over 1,100 reports in 2022 alone, it accounted for nearly half of all vulnerabilities disclosed. Cross-Site Scripting attacks are a type of injection in which malicious scripts are injected into otherwise benign and trusted websites. More than a third of the XSS issues required administrative permissions on the website itself in order to be successful, so the risk was greatly reduced. This does highlight why users should only be given the minimum level of access they absolutely need. WordPress has a strong permissions architecture with varying roles – Contributor, Author, Editor, Shop Manager, and Administrator being the most common – each with different abilities.
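The principle behind XSS defences is that user-supplied text must be escaped before it is placed into a page, so any injected markup is rendered as inert text. A generic sketch in Python (illustrative only – WordPress has its own escaping functions such as `esc_html`):

```python
from html import escape


def render_comment(author: str, body: str) -> str:
    # Escape user-supplied values before interpolating them into HTML.
    return f"<p><strong>{escape(author)}</strong>: {escape(body)}</p>"


# An attacker submits a script tag as their "comment"...
malicious = '<script>document.location="https://evil.example"</script>'
safe_html = render_comment("guest", malicious)

# ...but after escaping, "<script>" has become "&lt;script&gt;",
# so the browser displays the payload instead of executing it.
print(safe_html)
```

The vulnerability arises whenever a template skips that escaping step for even one user-controlled value, which is why XSS remains so common across plugins.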
Despite the number of reported XSS vulnerabilities, there were around three times as many SQL injection attack attempts as XSS attempts. A SQL injection attack is when an attacker tries to run database commands through a website that has not taken care to sanitise what people enter into forms. There was also a comparable number of malicious file upload or inclusion attacks; these might be where someone gains access to the administration area and uploads a script, rather than the intended image or text, to gain further access.
There are more and more leaked password lists available online as data breaches occur. Credential stuffing is where hackers utilise usernames and passwords taken from these lists to try to log in to the admin area of your site. When directories like HaveIBeenPwned (enter your email to see which leaks your info has been a part of) list over 12 billion compromised sets of credentials, it is no wonder that Wordfence collectively blocked over 159 billion login attempts in 2022. Surprisingly, this is actually a slight decrease on 2021.
To keep your site safe, please don't use your WordPress login details on any other site, and make sure that when you create the password it is rated as Strong. These are good principles to apply to any site, and combining them with multi-factor authentication wherever it's available will make things even more secure.
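As a rough illustration of why "Strong" matters (this is a simplistic heuristic for explanation only, not the algorithm WordPress's strength meter uses), password strength is usually estimated from length and the variety of character classes used:

```python
import math
import string


def estimated_entropy_bits(password: str) -> float:
    """Rough entropy estimate: length * log2(size of the character pool used)."""
    pool = 0
    if any(c in string.ascii_lowercase for c in password):
        pool += 26
    if any(c in string.ascii_uppercase for c in password):
        pool += 26
    if any(c in string.digits for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    return len(password) * math.log2(pool) if pool else 0.0


# A short, single-class password scores far lower than a long, mixed one.
print(round(estimated_entropy_bits("password"), 1))
print(round(estimated_entropy_bits("C0rrect-Horse-Battery-9!"), 1))
```

Each extra character multiplies the search space, which is why length beats cleverness: a long passphrase with mixed character classes takes credential-cracking hardware vastly longer to brute-force than a short "complex" password.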
It has always been important to make sure that the core WordPress code, its plugins, and themes are kept up to date with the latest patches and it is no less true now. It’s obviously good practice for keeping your site secure, but also for extending its overall lifespan. Trying to upgrade very out of date plugins or WordPress code is very time consuming.
Wordfence saw that most attacks targeting specific vulnerabilities exploited known, easily exploitable flaws on sites that had not received any recent updates. Infotex will take care of your site and make sure it has the latest patches to keep things running smoothly. Indeed, Wordfence stated: "As such, the greatest threat to WordPress security in 2022 was neglect in all its forms."
The second largest category of attacks was from known malicious “User Agents”. A “User Agent” is the formal term for a browser but also encompasses many other ways in which website content is processed. From Infotex’s own data around 60% of all requests are typically non-human in nature.
In addition to the more legitimate search engine robots (aka "bots"), many of these requests are from bots that have no purpose on a site or system other than to attack it.
A common task for these bots is looking for webshells. A webshell is a script planted by an attacker who has gained a foothold within a website; it allows them to retain control and make the server act on the attacker's behalf. It's commonplace for attackers to compete for access to webshells, as nefarious access to one can cause huge problems for the site's owner. Wordfence saw over 23 billion attacks of this type in 2022 across the 4 million+ sites they protect.
The full report is available to download via Wordfence.com.
There has been a lot of noise in the media over the last month over the rapid rise of AI tools such as ChatGPT, Google Bard, and Microsoft Bing’s AI enhanced search. AI is nothing new, but ChatGPT reached 1 million users in less than a week and 100 million in under two months.
Basically, it's a chatbot. The tool lets you provide a natural-language prompt or question, and ChatGPT responds in natural-sounding language. The bot uses the previous questions and prompts on the same thread to inform its responses to later ones. Surprisingly (or perhaps to avoid the Skynet of the Terminator films), the bot doesn't use the internet for its responses – they are based solely on the huge data set it has been trained on.
You can request it to answer questions or be creative by writing a poem on a specific topic. Many are using it to write covering letters for job applications, solve maths problems with a step-by-step breakdown of the answer, and write code that goes into websites.
At Infotex we polled the team as to how they'd been using it, and they have already put it to a wide range of uses.
ChatGPT isn’t foolproof though, and even ChatGPT’s owners OpenAI note “It’s a mistake to be relying on it for anything important right now. We have lots of work to do on robustness and truthfulness.”
ChatGPT’s rapid rise has prompted Google to expedite their new AI-powered search feature ‘Bard’. This uses natural language processing and machine learning to provide more relevant and insightful information to users.
I don’t expect it to replace the traditional ‘10 blue link results’ in Google but do expect the top section of the results pages to begin including AI-generated responses and answers to questions/queries.
This could be game-changing for search in many ways.
One downside of these generated answers would be fewer clicks on the organic results, which would be frustrating for website owners looking for traffic. As it’s Google, I’d expect to see them protecting their more commercial search terms which would currently be occupied by paid ads. It wouldn’t be too difficult to keep commercial queries (“car insurance”) and general questions (“what is Newton’s third law”) separate.
At the same time, Microsoft is adding AI-generated answers to its Bing search results pages.
These are just a few examples of the many applications and uses of ChatGPT. Its versatility and ability to understand and generate human-like responses make it a valuable tool for a wide range of industries and use cases.
Everyone has their own idea of what the future will look like… but in the fast-moving digital world, the future is never far away. We’ve asked some of our team what their predictions are for the coming year. Do you agree?
The word 'metaverse' was runner-up for the Oxford University Press Word of the Year 2022 – a telling sign of the growing conversation around a future where digital and physical worlds merge. Contributing to this future is the growing traction of Augmented Reality (AR), a type of Extended Reality (XR) on the rise alongside Virtual Reality (VR) and others.
Note, for instance, the new World Cup FIFA+ Stadium Experience, an augmented reality overlay that allows stadium audiences to view stats, heatmaps, insights, and VAR replays on their phones while they watch the match live. This is just one of many examples of AR, a technology that brings together digital data and the physical world and is predicted to reach a global market of $50 billion by 2024. While the technology is usually implemented in mobile apps – such as Amazon's View in Your Room feature or the Ikea Place app – it is starting to appear on websites too, such as knitted tie store Broni and Bo's virtual try-on. AR might prove particularly beneficial to business owners in sectors such as beauty, manufacturing, and tourism.
Amidst the constant online buzz of activity, brands and platforms alike are battling to create meaningful and memorable user experiences. Motion is one of the ways your brand's website can stand out and hold on to user attention. Implemented well, a user experience that includes motion can communicate a story, sequence, or transition more effectively than one without.
Interfaces that include motion do not have to rely on plugins but can be integrated through development frameworks. Enhancing your website and brand through Motion UI doesn’t have to mean animation or videos – additions as simple as the motion micro-interactions that occur when a user hovers over an action point or clicks a transition button can make the difference between a static website and one that ignites a user’s interest. Take a look at the Motion UI on our own website, for example.
As the internet grows and changes, the popularity of voice search continues to rise through Amazon Alexa style devices and “Hey Siri” requests. And, with the increasing popularity of Internet of Things (IoT) devices, such as smart speakers, this trend doesn’t show any signs of decline. This is not a trend to ignore: optimising your business for voice search will help with every aspect of your overall SEO. Click here to find out more about what steps to take for voice search optimisation.
You may not realise, but many websites that you visit are actually using PWA technologies to provide an experience closer to that of native applications. You can see this when you visit sites like Twitter, Gmail, etc.
Progressive web apps are essentially web applications that feel and function like a native mobile application. This means they increase the quality of user-experience by offering advantages such as offline use, hardware access, push notifications, and the ability to be “installed” on the user’s device. While these clever web apps have been on the increase for a while, their popularity shows no sign of slowing down. Click here to read more about progressive web apps.
Single Page Applications (SPAs) are a key cause of our constant scrolling habits. SPAs work inside a browser to offer a seamless user experience by dynamically loading content into a single page. This way the user does not have to wait for the site to continually reload and can enjoy uninterrupted scrolling. They can offer better page performance and data protection, and work efficiently even on a poor internet connection, as the content loads in full on the first communication with the server.
Single-page websites (SPWs) work in much the same way. Website content, such as that which might otherwise be found under a "Work" or "About" tab, is fully loaded on the initial page and can be navigated by links within the one page. These intuitive and well-structured single-page websites increase the likelihood of maintaining the attention of users, and enable control of the order in which information is absorbed. Compared to multi-page sites, the design and development require less time and money, and the result is more suited to optimisation for mobile devices.
Smart Content refers to the dynamic elements of your website that change depending on the site user profile. It targets individual customers with a personalised experience, and also decreases site loading times to drive significantly higher conversion rates and ROI. Read more about Smart content loading in our blog.
The green transition is here, and, with the internet as a major producer of carbon emissions, web developers have an important role to play. From sustainable web design, to efficient web development, to green hosting, there are many things website creators can and should be doing where possible. As awareness grows about the need for online business that cares for people and planet, and creative solutions increase, we expect sustainable web development practices to continue to grow.
Online security may be one of the less glamorous elements of the web, but for any company that has experienced a cyber attack first-hand, it has always been extremely important.
During 2022 we saw an increase in large-scale nation-state cyber attacks, such as the Russian attacks against Ukraine and Montenegro and the unattributed attack on the New Zealand government. In 2023, businesses should expect attacks of this kind and scale to become more common and sophisticated. Some of the more pessimistic members of our team would not be surprised if a government body or key public service were brought down by a cyber attack.
These security concerns are not just reserved for large corporations. In 2022, research by the World Economic Forum found that 95% of cyber security issues were caused by human error or a lack of cyber security awareness. Websites and web applications process a lot of valuable data, and with more company assets moving to the cloud to accommodate hybrid/remote working, the potential damage caused by cyber attacks has never been higher.
Many of the trends we see for 2023 are very similar to those we saw in 2022. Will 2023 be the year that web3 finally kicks off? Or the year there is a considerable push on hardware that supports AR, making it an essential part of our daily lives? Only time will tell!
One thing is certain, however: companies that provide clients and customers with the best user experience will thrive in 2023. There is a lot of exciting new technology out there, but there is no magic bean this year that will separate the pack. Companies that take the time to understand their customers and demographics, and tailor their website and online marketing to utilise the above tools (Motion UI, smart content, PWAs, AR, etc.) correctly, will come out on top.
Digital security is a necessity in an age where attacks and data exfiltration are commonplace. Hosting and managing hundreds of websites and systems also means handling a lot of valuable information. Keeping that data safe is a responsibility we take very seriously.
Cyber Essentials Plus is a UK Government-backed scheme designed to demonstrate organisations’ resilience against cyber attack. It ensures our systems are up-to-date, secure and fit for purpose, meaning our clients can rest assured that they are working with a business that is confident in its digital security.
The standard Cyber Essentials certification covers five main areas: firewalls, secure configuration, user access control, malware protection, and security update management.
As part of the Plus version of the certification, Infotex underwent an independent external technical audit by URM Consulting to ensure that the necessary technical controls are in place for the security of our systems. A random sample of staff were selected to be audited, making sure their work environments are up to date and secured. Our in-house infrastructure team periodically reviews all devices to ensure they are configured correctly. By passing, we are proving our internal processes, policies, and security controls are in line with National Cyber Security Centre (NCSC) standards.
Having previously completed Cyber Essentials Plus, the biggest change for this year is that all cloud services admin accounts offering multi-factor authentication must now have it enabled. In fact, Infotex have gone one step further and enabled it on all cloud services where that is feasible. Alongside this, the minimum password length has been increased for all accounts, reflecting an increasingly hostile online environment where password-cracking technology continues to improve. We have also now disabled that stalwart browser of the last two decades, Internet Explorer, on all our Windows devices, bringing that chapter of the web to a close.
Much like a car MOT, Cyber Essentials Plus is the minimum that we work to. We go above and beyond it with regular reminders and training, both face-to-face and virtual, for all Infotex staff, keeping security in mind across our practices, device configurations and website development processes. That way we are doing all we can to maintain our ongoing cyber security, knowing that it forms part of our clients' security too.
If you are looking at your business's cyber security, then undertaking Cyber Essentials Plus is something that we'd thoroughly recommend. It is a way to focus the company on the aspects that will give you the greatest security benefit against real-world attacks, as the NCSC evolves the standard every year based upon the attack data it witnesses.
It is estimated that data centres contribute 2% of all global greenhouse gas emissions – a figure that is rising as digital demand increases. However, by utilising cloud-based services for our hosting we are sharing resources and facilities, which reduces the number of duplicate, energy-hungry single-use servers.
We are conscious that site hosting will have an impact on Infotex’s carbon footprint. Because of this we are always looking to make sure our technical partners have, or are, taking steps towards sustainability. Our monitoring systems also help us to ensure that we are using these resources efficiently.
For the hosting of our primary websites and systems we use three main providers: Rackspace, Amazon Web Services (AWS) and iomart.
Rackspace’s approach to the environment is straightforward: they aspire to give back more than they take from the planet.
In 2019, Rackspace reviewed its energy strategy and opted to focus resources and efforts on energy reduction instead of purchasing carbon offsets.
Rackspace’s UK data centres LON3 and LON5 run on 100% renewable energy. Data centre LON8 does not, though Rackspace publishes an Environmental, Social and Governance Report (2021) showing steps they are taking to be net-zero across all sites by 2045.
Their commitment to a greener business isn’t just limited to energy. They have a host of creative ways to minimise waste in their offices, such as composting coffee grounds and shipping pallets, refurbishing retired IT equipment for aftermarket use, and collecting HVAC condensate to maintain landscaping and operate cooling towers.
As part of their route to net zero, they have been publishing a greenhouse gas emissions inventory every year since 2008, covering their global operations.
For further details visit Rackspace’s Corporate Responsibility section of their site.
Amazon Web Services (AWS) is targeting having its global operations powered by renewable energy by 2025. The London- and Ireland-based AWS regions (where we host our sites and systems) are currently powered by 95% renewable energy.
In 2019 Amazon launched the UK’s largest wind Corporate Power Purchase Agreement, located in Kintyre Peninsula, Scotland. The new wind farm is expected to produce 168,000 MWh of clean energy annually – enough to power 46,000 UK homes every year.
Amazon provides a Customer Carbon Footprint Tool which allows us to monitor our own carbon emissions and how those would compare to running on-premise computing equivalents – cloud computing can be 80% more efficient in this respect.
For further details visit Amazon’s Sustainability in the Cloud section of their site.
All of iomart’s data centres are powered by 100% renewable energy. They continuously evaluate sites to reduce emissions further, such as looking at how waste heat can be turned back into usable power. This work won them the ‘Best Use of Emerging Technology’ award at the Digital City Awards in March 2022.
In 2022 iomart developed a Carbon Roadmap to help understand their Scope 1 and 2 GHG emissions, and set carbon reduction targets. They also comply with ISO50001 Energy Management to reduce energy usage.
Further details can be found on iomart’s Environmental, Social & Governance page.
This is a topic which started over 10 years ago; it is led by the USA’s Cybersecurity & Infrastructure Security Agency (CISA) and is shared with European Cyber Security Month (ECSM).
While the topic may seem ethereal and mired in complicated titles, the principle behind it is very simple and one which every business should take time this month to consider if you haven’t already.
October is a month when many businesses start to focus on the busy period ahead, and getting the basics in place before that rush could save you valuable time later on. So here are some thoughts and actionable tips.
Cyber Security starts with the simplest of things, which hopefully everyone reading this knows and implements already:
Infotex have gone through the accreditation process, and while we had a good security understanding beforehand, this has helped focus everyone’s attention on the issue.
Phishing is when a fraudulent email is sent to you asking you to take some action, in the belief that the email originated from someone you know. This is one of the biggest threats to any organisation today, with almost a quarter of breaches in the Verizon Data Breach Report 2022 starting via a phishing attack.
It is believed that around 3% of all phishing emails successfully entice their viewer to click the link. The emails are often very convincing, using a combination of familiarity (based on information colleagues have posted about themselves online, sometimes unwittingly) and a sense of urgency. It is always worth taking that moment to check, because clicking a fraudulent link could be the start of a chain of events you’ll never forget.
Phishing doesn’t just happen via email. Text messages and phone calls are also becoming more common vectors for phishing attacks as awareness of email phishing rises.
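A classic giveaway in a phishing email is a link whose visible text shows one address while the underlying href points somewhere else entirely. As a minimal illustrative sketch (the function name, heuristic and example domains are our own, not a standard library API):

```python
from urllib.parse import urlparse

def link_looks_suspicious(display_text: str, href: str) -> bool:
    """Return True when a link's visible text names one domain
    but the underlying href points at a different one."""
    # Normalise the display text so urlparse can extract a host from it
    shown = urlparse(display_text if "//" in display_text else "//" + display_text).netloc
    actual = urlparse(href).netloc
    # Only flag when the display text actually looks like a domain
    return bool(shown) and shown.lower() != actual.lower()

# A link showing a bank's address but pointing at an attacker's server
print(link_looks_suspicious("www.mybank.com", "https://login.evil.example/"))  # True
```

Real mail filters use far richer signals, but hovering over a link and doing this comparison by eye is exactly the habit worth teaching staff.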
Ransomware is designed to prevent you from getting access to the files on your computer by encrypting them. You are then invited to pay a ransom to unlock the files.
It is generally recommended not to pay ransoms, as you can’t be sure that the attacker will fulfil their side of the deal. You’re also funding organised crime and encouraging future attacks. It is better to invest in good protection and well-protected external backups that are not directly connected to any device. Ensuring your computing devices and programs are up to date and have good antivirus software installed costs very little but offers a lot of protection. Also maintain a good policy on keeping operating system and software patches up to date, such as Windows Updates. Finally, if you run as a limited user rather than an administrator, that often reduces the damage an attacker can inflict.
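Part of the backup advice above can be automated: a backup you never check is barely a backup at all. A minimal sketch under our own assumptions (a single backup folder on disk, freshness judged by file modification time) — illustrative only, not a substitute for proper backup tooling:

```python
import pathlib
import time

def backup_is_fresh(backup_dir: str, max_age_hours: float = 24.0) -> bool:
    """Return True if the most recently modified file under backup_dir
    is newer than max_age_hours old; False for an empty folder."""
    newest = max(
        (f.stat().st_mtime for f in pathlib.Path(backup_dir).rglob("*") if f.is_file()),
        default=None,
    )
    if newest is None:  # no files at all: treat as a failed backup
        return False
    return (time.time() - newest) <= max_age_hours * 3600
```

A scheduled task could run this nightly against the mounted backup volume and raise an alert whenever it returns False.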
Within cyber security, the term “capture the flag” refers to an exercise whereby one team sets out to obtain some item of data held by another team within the business. If they are able to obtain it, then both teams stop, learn how it happened and agree on steps that can be taken to ensure that a genuine attacker could not do so, thus increasing the overall security of the organisation.
You don’t need formal “red and blue teams” to do this; even the smallest of businesses can benefit from trying it. Perhaps start by seeing whether one staff member can find the login password (or passphrase) for another member of staff’s computer. Is it on a post-it attached to their monitor? Is it the name of their child, cat or favourite holiday destination? Do they leave their PC logged in while they take their lunch break, allowing anyone to walk up and use the PC in their absence?
The aim of Capture The Flag is not to belittle anyone but rather for everyone to learn from the experience and collectively improve your defences.
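The pet’s-name check above can even be sketched in code. This is purely illustrative — the function and the list of “publicly known facts” are our own invention, standing in for details a colleague may have posted online:

```python
def is_guessable(password: str, known_facts: list[str]) -> bool:
    """Return True when the password contains any publicly known fact
    about its owner (child's name, pet, favourite destination, etc.)."""
    pw = password.lower()
    return any(fact.lower() in pw for fact in known_facts if fact)

print(is_guessable("Rover2021!", ["Rover", "Margate", "Alice"]))      # True
print(is_guessable("correct-horse-battery", ["Rover", "Margate"]))   # False
```

Run good-naturedly as part of a capture-the-flag exercise, a check like this makes the point far more memorably than a policy document.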
These are just a few of our thoughts; there’s much more advice available online, as well as events in both the virtual and physical world. But now you’ve read this article, do ask yourself: is even that advice genuine, or is someone trying to get information out of you?
Cookies are a well-known topic of concern for internet data security. Yet we find ourselves interacting with them every day, mindlessly accepting the cookie banners on websites we visit as we go about browsing the internet. Does it matter?
Here’s everything you need to know about the pros and cons of cookies and how to be mindful of them.
Cookies are small snippets of data created by websites when you visit and browse them. They were first invented in the mid-1990s by a developer at Netscape, as a way to inform the browser whether a user had previously visited a particular website.
Cookies sometimes provide essential roles for websites, such as by remembering the items saved in your shopping basket on an ecommerce website until you check out.
Other times, cookies are used by advertising companies to retain data about your browsing habits and target ads to you across your browser. Ever wondered how you are targeted over and over with ads for something you once viewed?
The uses of cookies can be categorised into three broad purposes:
Functional, whereby cookies inform the server of past website activity by this specific user. For instance, when you log in to a site, a cookie maintains your shopping basket as you jump between pages.
Personalisation means that cookies help a browser remember the activity or preferences of a user. When the user revisits the website, the experience can be tailored to them (such as by remembering your chosen light/dark colour scheme).
Tracking, whereby cookies record user activity for advertising or analytics purposes, either to show information customised to you or to report that activity back to the site’s owner.
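To make the functional case concrete, here is roughly how a server builds a cookie and reads it back on the next request, sketched with Python’s standard `http.cookies` module (the `basket_id` name and value are our own examples):

```python
from http.cookies import SimpleCookie

# Server side: build the value for a Set-Cookie response header
# carrying a shopping-basket identifier
cookie = SimpleCookie()
cookie["basket_id"] = "abc123"
cookie["basket_id"]["path"] = "/"
header = cookie["basket_id"].OutputString()
print(header)  # basket_id=abc123; Path=/

# The browser then sends "Cookie: basket_id=abc123" back on each
# request, and the server parses it to recognise the same basket
incoming = SimpleCookie("basket_id=abc123")
print(incoming["basket_id"].value)  # abc123
```

This round trip — set once, sent back automatically on every subsequent request — is what lets a basket survive as you jump between pages.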
These store user information during one specific site visit, and are deleted either when the browser is closed or after a period of inactivity. Commonly these are used to store confirmation of whether you are logged in or not.
These come directly from the website you are visiting, and the information contained is restricted to that site. They will remain in your browser between visits, for example when you click “remember me” on a login panel to show your email when you return.
These are generally benign, provided the website you are browsing is trustworthy and uncompromised. To aid this, the site owner can mark them as only accessible over a secure connection by their web server, and not readable by scripts running in your browser.
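The protections just described correspond to the standard `Secure` and `HttpOnly` cookie attributes (plus `Max-Age` for lifespan), which can be sketched with Python’s standard `http.cookies` module; the cookie name and value here are our own examples:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "tok-9f2a"
cookie["session"]["secure"] = True    # only ever sent over HTTPS
cookie["session"]["httponly"] = True  # invisible to in-page JavaScript
cookie["session"]["max-age"] = 3600   # expires after one hour

# The resulting Set-Cookie header value carries all three attributes
header = cookie["session"].OutputString()
print(header)
```

`HttpOnly` in particular means that even a script injected into the page cannot read the cookie, which blunts a whole class of session-stealing attacks.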
Third-party cookies are those that come from companies external to the website you are browsing; one such example is an image served by an advertiser. These are often used to track your behaviour, providing targeted ads across multiple sites you visit, and they can have long lifespans of a year or more. One of the most common sources of such cookies on the web is Google Analytics.
Known by several names, such as zombie cookies or evercookies, these use combinations of all of the above and more, such as browser “local storage” or specially crafted cache entries, to recreate user information and tracking profiles even when regular cookies have been cleared from your browser. They are almost always used to track user behaviour, such as for advertising purposes, and can be extremely difficult to fully remove.
Generally, cookies are safe. They can only store a limited amount of data and, unlike executable code, cannot easily be hacked or used to install viruses on a computer. However, an insecure cookie – one that is communicated unencrypted or intercepted via third-party scripting on a site – can be a potential security risk for visitors or operators of the origin website. With cookies carrying only simple information, though, the risk is rarely of high concern.
Instead, the concern most associated with cookies is the privacy of personal data and tracking.
Cookies can be used to allow advertisers to store information about your browsing habits to provide targeted ads that follow you around the web.
But, supposedly, this cannot happen without you knowing about it. Laws such as the GDPR, the ePrivacy Directive and the Data Protection Act 2018 mean that operators of sites using cookies have to ask for your informed consent to gather data, except where that data is needed for core site functionality. That’s why there are so many banners online now, asking for your agreement. But often we just click “accept all” without thinking twice…
While cookies are generally safe, it is worth knowing that they are not difficult to control.
Your browser’s preferences or settings will allow you to:
Many browsers will also let you browse in private or ‘incognito’ mode, preventing your browsing history or cookies from being stored, or indeed allow you to clear cookies on a per-site basis.
Some modern privacy-centric browsers now offer ‘state partitioning’ – a fancy way of assigning third-party cookies to the site you were viewing when they were set. That way the adverts on a site remain with that site, rather than following you around the web despite the tracking companies’ best efforts to do so.
Browser manufacturers know that third-party cookies have gained a poor reputation due to the tracking misuse outlined above. In 2021 Google announced that their market-leading Chrome browser will cease support for third-party cookies in 2024. They are, however, also piloting new technologies to replace them, called FLoC and its successor, Topics. These are intended as ways for advertisers to obtain a generic profile of the site viewer which is shared with many other individuals worldwide, allowing relevant adverts to be shown based upon the types of site viewed recently (typically these profiles last three weeks), while not allowing the advertisers to identify the viewer individually.
Google’s recently introduced analytics product, GA4, is specifically designed to be event-based and work without cookies, unlike previous versions.
We are delighted to announce Infotex have been accepted into the Crown Commercial Digital Outcomes 6 framework, which will be live later this year.
Crown Commercial Service supports the public sector to achieve maximum commercial value when procuring goods and services.
Acceptance onto the framework allows local government and healthcare organisations access to services provided by Infotex. Our ambition is to work more closely with a wider range of organisations in order to design, build, improve and support the back-end systems that sit within healthcare and government to produce better outcomes for all.
Frameworks are agreements between the government and suppliers to supply certain types of services under specific terms. Infotex Ltd have been accepted to provide:
As a digital outcomes supplier, we must:
Jonathan Smith, Director of Infotex Healthcare Systems commented “We are delighted to be accepted onto the framework. It gives us greater opportunity to support the NHS and wider services using our experience in the development of the systems we are already delivering into the care sector”.
“This additional platform reflects the hard work and dedication of our team to really deliver systems in the right way, to the right audience. We can continue to support healthcare teams and patients on the path to better digital assessment and care which is so important.”
Most recently, the team launched a digital self-referral platform that allows the smooth and carefully managed assessment of podiatry patients, which decreased our client’s 800+ patient backlog to manageable levels within just a few weeks.
Take a look at a review by Dr Hinkes of this system.
In 2019/20, CCS helped the public sector to achieve commercial benefits worth over £1bn – supporting world-class public services that offer best value for taxpayers.
For further information about Infotex’s health systems get in touch.
Discover how our team can help you on your journey. Talk to us today.