Loading time, Speed Index, Page Speed Score… which indicator better represents the User Experience?

Web Performance covers the speed and health of your business funnels. How fast your pages load matters enormously for the overall User Experience of a digital interface; for an e-commerce store, it directly impacts conversion rate and business results.
It should therefore be closely monitored! But which metrics should you follow? Loading time, Speed Index, Page Speed Score… many key performance indicators are available. How should you use them, and which ones best represent the User Experience?

Here is a little tour of the main Web Performance KPIs that should catch your attention.

Loading time: a basic metric, but a little outdated

Loading time was the first performance indicator in common use. It simply represents the total loading time of a page, until the last pixel is displayed.
It is worth following, but it does not really represent the customer experience. The main reason is that a page is usually displayed progressively and becomes interactive before it is fully loaded, so the user feels the page is ready well before the total loading time is reached.

Speed Index: a weighted measure of loading time

The Speed Index is a measure of display time that takes the actual user experience into account. It is expressed in milliseconds: a low value means a fast display and therefore a better user experience. Concretely, the Speed Index represents the average display time of the pixels above the fold (the part of the page visible without scrolling).

Let’s take an example:

Page A is displayed progressively (30% of pixels at 0.5 seconds, 70% at 1 second, and 100% at 1.5 seconds), whereas page B is displayed all at once (0 to 100% at 1.5 seconds).

% visually complete graph of page A vs page B

Page A and page B have the same loading time (1.5 seconds). However, the user experience is much better on page A.
Because page A is displayed gradually, it has a double positive effect for the user:

  • He gets visual confirmation that the page is loading: it makes the wait easier and keeps him focused on the page.
  • He can already use some of the page content: the loading time is no longer “lost”.

The Speed Index calculation takes this difference into account and rewards progressive display. Thus, the Speed Index of page A will be 1000 ms, while that of page B will be 1500 ms.
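
To make the arithmetic concrete, here is a minimal sketch of how a Speed Index can be derived from a visual-completeness timeline (the kind of data a tool such as WebPageTest produces). The timelines below are the hypothetical page A / page B examples from this article, not real measurements.

```python
# Minimal sketch: Speed Index as the area above the visual-completeness curve.

def speed_index(timeline):
    """timeline: list of (time_ms, visually_complete_fraction), sorted by time,
    ending at the moment the page is 100% visually complete."""
    si = 0.0
    prev_t, prev_vc = 0.0, 0.0
    for t, vc in timeline:
        # During [prev_t, t] the page was only prev_vc complete:
        # the "missing" fraction (1 - prev_vc) contributes to the Speed Index.
        si += (t - prev_t) * (1.0 - prev_vc)
        prev_t, prev_vc = t, vc
    return si

page_a = [(500, 0.30), (1000, 0.70), (1500, 1.00)]   # progressive display
page_b = [(1500, 1.00)]                              # everything at once

print(speed_index(page_a))  # 1000.0 ms
print(speed_index(page_b))  # 1500.0 ms
```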

What is a good Speed Index?
It is commonly accepted that after a 1-second wait, customers lose their train of thought and their purchase intent. After 3 seconds, more than half of them leave the site out of frustration. A good Speed Index should therefore ideally be (and stay!) under 1000 ms. Between 1000 and 3000 ms, navigation remains “acceptable” but fragile. Above 3000 ms, improvement is absolutely necessary, as the risk of losing visitors becomes very high.

Page Speed Score: a best practice compliance indicator

The Page Speed score is a very different indicator. It does not represent loading time, but compliance with Google’s best-practice recommendations.
It is a global grade out of 100, built from the compliance grade of each individual best practice.

Page Speed Score indicator

Thus, Google Page Speed is not an indicator of the user experience itself, but rather a tool that highlights optimization opportunities. It is very complementary to the other KPIs: following your Page Speed score will help you build the optimization roadmap that ultimately improves your Speed Index and User Experience.

In conclusion, there are many indicators for tracking the Web Performance of your pages. But in the end, what you need to follow is the big picture: “How good is my User Experience?”. To answer that question, the Speed Index is the reference. Then, once you have identified which pages need to be improved, you need to know what to do next, and that is when the Page Speed score makes the most sense.

 

Why using a CDN is critical to your web performance


Here at Quanta, the one thing that concerns us is your web performance. We aim to become the best copilot there is for the success of your e-commerce platform. And that is why, today, I decided to talk a little bit about CDNs.

As you surely know, the secret to the success of your e-commerce website is speed. Speed is what supports your user experience and your loading times and, by extension, what greatly affects your turnover.

And that is exactly why using a CDN should be an arrow in your quiver of optimizations.

What is a CDN?

A CDN, or Content Delivery Network, is a network of servers spread all over the globe.

These servers, called CDN nodes or edge nodes, cache the static content of your website. And when I say static content, I’m talking about everything that a user spends most of his time waiting for: images, CSS, JavaScript files, etc.

Image credit: Gtmetrix

In other words, edge nodes store the content of your website in multiple locations, and the node closest to your user is the one that delivers your website’s content to him.

Instead of waiting for content hosted on a server in the US, for example, your user in Spain will get your content from a copy stored in Italy, which greatly reduces the latency involved in fetching the data.
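
To get a rough feel for this effect from your own machine, you can time the same static asset served from your origin and from a CDN edge. This is only a sketch: www.example.com and cdn.example.com are placeholder hostnames, and it relies on the third-party requests library.

```python
# Rough sketch: compare download time of the same static asset from the
# origin server and from a (hypothetical) CDN hostname.
import time
import requests

URLS = {
    "origin": "https://www.example.com/media/banner.jpg",   # placeholder
    "cdn":    "https://cdn.example.com/media/banner.jpg",   # placeholder
}

for name, url in URLS.items():
    start = time.perf_counter()
    resp = requests.get(url, timeout=10)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{name}: HTTP {resp.status_code}, {len(resp.content)} bytes in {elapsed_ms:.0f} ms")
```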

Why is it the best ally for my web performance?

There are three main reasons why a CDN is highly recommended when you’re trying to improve your speed, user experience, and stability. Here they are:

  • Using a CDN reduces latency

Like I said earlier, a CDN stores your static content in edge nodes, and delivers your content through the node geographically closest to your final user.

The first impact of a CDN on your overall speed is that your users don’t have to wait for an answer from your main server: they get your content right away. If your website is hosted in another country (or even another continent) than your users, loading times can increase significantly due to the latency inherent in the physical distance.

Image credit: Incapsula

A CDN thus increases your delivery speed and improves your user experience.

  • Using a CDN reduces the risk of crash

The second big advantage of using a CDN is that it greatly reduces the risk of downtime due to a peak in traffic.

As we’ve said before, if speed is key to your website’s web performance, stability is also very important. And nothing is worse for your turnover than an unanticipated traffic peak that causes your e-commerce platform to crash.

By allowing your users to access your content through the edge node closest to them, you reduce the risk of overload: the bandwidth is distributed between several servers instead of your main server supporting the full weight of the traffic.

  • Using a CDN improves security

The third advantage of using a CDN is its impact on your website’s security.

Faced with an unprecedented increase in DDoS attacks, websites need to prepare and improve their security. A CDN, as it is located outside of your architecture, will act as a shield for your main server.

And since all your static content is duplicated throughout the CDN, even if one edge node falls victim to an attack, the rest of the network takes its place and keeps supplying your users with content, even if some extra latency is then to be expected.

A CDN can be the first line of defense for your architecture.

Conclusion

A CDN is truly a top-level optimization for your website. It will greatly enhance the speed of your website, particularly if you have an international user base, and thus reduce loading times and improve user experience and turnover.

But, you don’t need to rush to implement it in your architecture. It’s an option in case you wish to push the limits of your web performance.

If you’re curious about the relevance of a CDN for your website, know that you can precisely analyse the time and speed of all your requests in the Waterfall view of your Web Scenario dashboard in QUANTA.

How HTTP2 will boost your Web Performance!


Those of you that are frequent readers of our Quanta blog know that we’re always looking out for new tech innovation that can greatly improve web performance for our e-commerce clients. And that’s why today, I chose to talk a little bit about the new HTTP/2 protocol.

HTTP/2 is a new and revised version of the HTTP/1 protocol, based on the innovations brought by the SPDY project. The numerous changes between versions 1.1 and 2 of the HTTP protocol truly deserve to be explained, and that is what I am going to do here, from a purely web-performance point of view. HTTP/2 does contain interesting new measures designed to improve security (most notably in the aftermath of the CRIME attack of 2012), but those specifics will not be discussed in this article.

HTTP/1.1, SPDY: The genesis of HTTP/2

First of all, let’s give Caesar his due: HTTP/1.1 was created more than 15 years ago, and the internet has changed tremendously since then. So, when talking about the inadequacies of HTTP/1.1, we must keep that in mind.

But even considering that context, it is fair to say that HTTP/1.1 has had its day. Why? Because HTTP/1.1 is simply too resource-hungry.

This protocol basically works by allowing only one request at a time per TCP connection. Originally, this rule was meant to better control the congestion created by large numbers of requests.

Due to the growing complexity of web pages, browsers tried to work around this rule by opening up to 8 TCP connections to issue requests in parallel. But not only is this technique resource-hungry (due to the strain it puts on the network, and thus on the client and the server), it is also suboptimal: the TCP connections end up “competing” for bandwidth, as no hierarchy or prioritisation can be clearly established between them.

Others tried to use HTTP pipelining (sending several requests over one TCP connection without waiting for each response) to work around the HTTP/1.1 rule. But in doing so, they ran the risk of having every response stuck behind the first one in line if it was delayed or lost (so-called head-of-line blocking).

How a classic HTTP request, HTTP pipelining, and head-of-line blocking work, by Jeffrey Bosboom

Thus, HTTP/1.1’s negative effect on web performance was judged increasingly detrimental.

So, in 2009, the SPDY project was launched to try to remedy the inadequacies of HTTP/1.1. SPDY was a Google project aimed at reducing page load times by implementing multiplexing (allowing multiple request and response messages to be in flight at the same time) and the prioritization of HTTP requests. This Google experiment slowly gained recognition and became widely used, even if users generally don’t realize it. SPDY was thus chosen as the basis for the first draft of HTTP/2.

What HTTP/2 will bring to Web Performance

As I said earlier, HTTP/2 is very different from HTTP/1.1. So, let’s take a look at the Web Performance orientated innovations that it contains.

HTTP/2 IS BINARY.

Unlike the textual HTTP/1.1, HTTP/2 is binary: messages are split into clearly delimited binary frames. This makes transferring and parsing the data much more efficient, compact and machine-friendly, and thus… faster. Being binary, HTTP/2 is also less prone to parsing errors, which can definitely improve performance.

HTTP/2 IS FULLY MULTIPLEXED AND USES ONLY ONE TCP CONNECTION.

Here we can really see the influence of the SPDY project. As we said before, the one-request-at-a-time rule was originally meant to reduce congestion, but as web pages grew more complex, browsers resorted to “cheating” it, losing the philosophy behind it. HTTP/2 goes back to a single TCP connection but addresses the problems HTTP/1 had:

  • Multiple requests and files can be transferred at the same time over a single TCP connection.

Multiplexing diagram

  • A slow or lost response no longer blocks the ones behind it at the HTTP level, because the rigid request chain of pipelining no longer exists.
  • Parts of one message can be reused by another message in order to pool the request effort.
  • The competition between TCP connections no longer exists. The client prioritizes its requests and simply assigns new requests to priority-tagged streams so that the most important ones (HTML or CSS requests, for example) are handled first.

Multiplexing and the single TCP connection allow a client to use only one connection for all its requests, which in turn improves loading times, response times and general speed. As speed is the key factor in Web Performance, improving it can only benefit the user experience.
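
As an illustration, the sketch below fetches several resources over what should be a single multiplexed HTTP/2 connection, using the third-party httpx library (installed with its http2 extra). The URLs are placeholders, and whether HTTP/2 is actually negotiated depends on the server you point it at.

```python
# Sketch: several requests reusing one HTTP/2 connection via httpx.
# Requires: pip install "httpx[http2]"
import httpx

ASSETS = [
    "https://www.example.com/",            # placeholder URLs
    "https://www.example.com/style.css",
    "https://www.example.com/app.js",
]

with httpx.Client(http2=True) as client:
    for url in ASSETS:
        resp = client.get(url)
        # resp.http_version reads "HTTP/2" when the protocol was negotiated.
        print(url, resp.status_code, resp.http_version)
```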

HTTP/2 ALLOWS FOR SERVER PUSH.

This allows a server to anticipate a user’s needs by sending content before the browser has even asked for it. More precisely, the server can push into the browser cache the JavaScript, image and CSS resources associated with an HTML page as soon as that page has been requested by the browser.

Illustration for the Server Push principle, by David Attard

Conclusion

It’s safe to say that HTTP/2 will bring the baseline Web Performance of websites to a new level. And that can only be a good thing for e-commerce websites, which face ever more demanding internet users. But only time will tell whether it can keep up with the ever-faster evolution of the internet and its usage.

If you wish to dive further into the specifics of HTTP/2, I recommend you take a look at its dedicated GitHub repository, which was the main source of information for this article. 🙂

QUANTA’s 5 tips for successful sales!

The first days of a sales period can represent up to 20% of the yearly turnover, for some e-commerce websites. So! To avoid any disturbance, here are QUANTA’s 5 web performance pointers to succeed in your sales.

Launch a closed-access version of the website, in advance


Some countries and states have passed regulations governing sales periods. For example, the French government forbids displaying discounted prices before 8 a.m. on the first day of the sales period. The big problem here is that most customers are eagerly waiting well before that time! And a massive change in the content of an e-commerce platform (for example the prices, or a design announcing “Sales”), while said platform is full of users, is sure to be too much for the servers.

Why? Simply because when this horde of users tries to access pages that haven’t been cached yet (because they’re “new”), the CMS has to perform complex calculations that will inevitably overload your servers.

So, to avoid this unpleasantness, it is far better to simply close your website the evening before the sales start. Trust me, you’ll lose very few customers that night because, come on! Who buys online the day before a sales period? 0_0 And you’ll definitely be thankful for this decision later. Little bonus pointer: put your design team to work and ask them to concoct a kickass maintenance page, with a little teaser video, a countdown, etc. Now back to business! In parallel to this “closing down”, put the “Sales” version of the website online with closed access, reachable only from your IP address and your web agency’s.

This way, between midnight and the official sales launch hour, you’ll be able to check one last time the performance of your entire sales funnel, in its “sales” version, in the prod environment.

I’m insisting on the “in the prod environment”, because even if your web agency gave you the green light in the pre-prod setting, it is better to be safe than sorry. Always check the entire sales funnel in its prod version, before opening hours.

The result is that, when the time comes to reopen the website, you’ll just have to drop your beautiful teaser page and you’ll immediately be able to welcome your customers. Your team will have gained some bags under their eyes during the night, but they will thank you later when they don’t have to perform emergency fixes in a hurry.

Automatically pre-load your caches


Instead of trying to visit all of your website’s pages by hand, in their “Sales” version, before the official launch, it is far more effective to use a bot to do it.

Why, you ask? Because that way, all the “sales” pages will be preloaded before the official launch of your “Sales” close-access website. They will be stored in cache (Varnish, Full Page Cache, or your CMS’s cache), before the arrival of your customers.

A page only needs to be accessed once for it to be stored in the cache system. That is why a bot is needed: to gradually visit all of your “Sales” website’s pages.

Several kinds of tools can handle this operation, from command-line crawlers (wget in mirror mode, for example) to small custom scripts; a minimal sketch of the latter follows the warning below.

Just a little warning: these are all “geeky” and powerful tools, so it is important to take your time conducting the operation described above. Otherwise, you run the risk of simply crashing your website with the influx of requests. So discuss the how and the when with your web agency!
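
Here is a minimal sketch of such a bot: it reads a list of URLs (for example exported from your sitemap) and visits each one slowly, so the caches get populated without hammering the servers. The file name, delay and User-Agent are assumptions to adapt with your agency.

```python
# Minimal cache-warming sketch: visit each URL once, slowly, so Varnish /
# Full Page Cache / the CMS cache gets populated before the official launch.
import time
import requests

URL_LIST_FILE = "sales_urls.txt"   # hypothetical file, one URL per line
DELAY_SECONDS = 2                  # be gentle: adjust with your web agency

with open(URL_LIST_FILE) as f:
    urls = [line.strip() for line in f if line.strip()]

session = requests.Session()
session.headers["User-Agent"] = "cache-warmer-bot"  # assumed identifier

for url in urls:
    try:
        resp = session.get(url, timeout=30)
        print(f"{resp.status_code} {url} ({resp.elapsed.total_seconds():.2f}s)")
    except requests.RequestException as exc:
        print(f"ERROR {url}: {exc}")
    time.sleep(DELAY_SECONDS)      # throttle to avoid crashing the site
```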

Forbid back-office actions on the first day!


It only takes a simple click in your CMS’s back-office to ruin one of the most important business days of your year. Clicks in the back-office consume a lot of server resources, and they can be dangerous for the platform, especially when they happen at the same time as a buttload of orders.

Our advice? At the very least, inform your e-commerce / sales team that on the first day of the sales period, nobody is to use your CMS’s back office to add new products, or modify the existing ones’ descriptions.

On that day, it’s hands off! No more touch-ups! And if you really want peace of mind, just plainly (and temporarily) forbid access to your back office for all non-tech, non-essential users. 😉

You don’t think that’s such a big deal? Well, we’ve seen and measured it, time and again, at QUANTA. A simple back office click can invalidate the caches of a lot of pages, and greatly affect a website (Cf point 2, for those of you that have been following :P).

More info on this subject here.

Temporarily add front-end servers


Warning, it’s time for a little tech talk! 😉 At QUANTA, we have observed that for almost all CMSs, and for Magento in particular, peaks in customer activity induce peaks in CPU load on the front-end servers (those that generate the pages of the e-commerce website) far more than on the database servers. Even though it usually works all by itself, the database server is generally less solicited than the rest of the architecture.

And that’s truly for the better, simply because it is complex to have several database servers working in parallel to hold a larger load, whereas it is very simple to add front-end servers and distribute queries between a group of servers (commonly referred to as a “server pool”).

In other words, to sustain the load on the first day of the sales period, rather than multiplying the database servers, just multiply the front-end servers.

However, do not do this rashly. If you only had a single front-end server until now, it is important to verify that your website’s architecture is ready to accommodate additional front-ends. The potential implications should be discussed beforehand with your agency, but the main one to focus on is user sessions (cookies). The question to ask yourself is whether, in your configuration, a user session is stored as a file in a directory on the front-end server, or in a shareable system such as the database or a cache server like Redis. The session must be shared and accessible by each of the front-end servers so that the user’s navigation (logging in, adding to the cart) is not disturbed.

Imagine if your customer puts an article in his cart, then is redirected to a server that can no longer find his session and his product. There is a good chance that he will slam the door of your website.

But even with this key step to be watchful of, adding front-end servers remains the easiest way to increase the capacity of your site for the first days of sales.

Last but not least. Try to install your front-end servers 2 to 3 days before the sales period, so as to be sure of:

  • The correct working of your sales funnel on each of your servers (yes, I already said that, and I’ll keep saying it).
  • The installation of your monitoring probes on these new servers, so that you have a complete map of the health of your architecture on D-Day.

Run load tests


In order to be sure that you will be able to welcome more visitors than usual on your website, I advise you to simulate a peak of traffic higher than the one you expect on the day of the sales.

Yes, yes, and yes! Give yourself some leeway! It would surely be extremely frustrating if your emailing campaigns worked too well but you ended up with a crashed website, torpedoing your turnover. By the way, it reminds me of the story of an e-commerce director on the first day of sales… 😛

To perform load tests, many solutions exist, grouped in 2 categories:

  • In-house tests performed by your teams, with generally free but time-consuming tools such as siege, wget, curl, ab, etc.
  • “Pro” tests performed by independent third parties such as CloudNetCare, Neotys or QUANTA.

Whichever solution is chosen in the end, during these tests it is important to:

  • Simulate what a customer would do on your website, by basing the scenario of your load tests on the behavior of your average customer, monitored in your Google Analytics history.
  • Do your calculations so that you can easily see where the limit is (i.e., before your website crashes). Unintelligible example of load-test results: “We can handle 13,422 HTTP requests per minute at 60% CPU.” “…OK. Buuuut… is that good or bad?” Intelligible example of load-test results: “My website in ‘Sales mode’ handles, without disturbance, 5 times the traffic recorded during our previous sales period, with virtual customers going through the whole sales funnel.” “OK, then I can relax for D-Day.”
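
For a rough idea of what an in-house test can look like, here is a small sketch that fires concurrent requests at a list of funnel URLs and reports throughput and average response time. It is only a starting point under the assumptions shown (placeholder URLs, fixed concurrency); real tests should replay full customer scenarios, as discussed above.

```python
# Very small in-house load-test sketch: N concurrent workers hitting the
# funnel URLs and reporting throughput and average response time.
import time
from concurrent.futures import ThreadPoolExecutor
import requests

FUNNEL_URLS = [                      # placeholder funnel pages
    "https://www.example.com/",
    "https://www.example.com/category.html",
    "https://www.example.com/product.html",
]
CONCURRENCY = 20                     # simulated simultaneous visitors
REQUESTS_PER_WORKER = 10

def worker():
    timings = []
    with requests.Session() as s:
        for _ in range(REQUESTS_PER_WORKER):
            for url in FUNNEL_URLS:
                start = time.perf_counter()
                s.get(url, timeout=30)
                timings.append(time.perf_counter() - start)
    return timings

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    all_timings = [t for ts in pool.map(lambda _: worker(), range(CONCURRENCY)) for t in ts]
elapsed = time.perf_counter() - start

print(f"{len(all_timings)} requests in {elapsed:.1f}s "
      f"({len(all_timings) / elapsed:.1f} req/s), "
      f"average response time {sum(all_timings) / len(all_timings):.2f}s")
```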

If you want to know more about the important things to keep in mind before you get started, we have already written a comprehensive guide on implementing load tests.

Good sales to you!


What’s Up Quanta #10 – Our Front-End monitoring is there!

It’s not been long since we wrote a What’s Up Quanta (announcing that we now have an integration with Blackfire) but considering that we love bringing to life new features, we thought that it was high time to talk a little bit about our recent front-end monitoring feature. 😉

Why monitor an e-commerce website?

For those of you who, like us, are passionate about web performance and e-commerce, you will have noticed lately that everybody is talking about users’ demand for instantness.

E-customers don’t want to wait. If they can’t buy online in a timely manner, on a website that offers great UX, they will simply leave.

You’ll thus have lost both a chance to sell and a chance to establish your reputation. And we all know how much it costs to attract new customers instead of simply retaining your current ones…

That being said, if you want to reduce your loading times, increase your conversion, retain your customers, AND offer a great UX, then web performance should be your number 1 priority.

And to optimise your web performance, you need to carefully monitor your website. And not just the back-end (meaning everything that is invisible to your users: servers, response times, network, etc.)! You also need to keep an eye on your front-end (meaning everything that is visible to your users: images, pages, links, tags, etc.) to have a complete picture of where your e-commerce platform could be optimised.

How does our front-end monitoring feature work?

Until recently, Quanta’s main focus was to offer precise, clear and constant back-end monitoring. This means that, thanks to our app’s profilers and dashboards, you could monitor your back-end and keep track of any incident or slowdown affecting your servers, network, databases, etc.

With the front-end monitoring feature, you have the opportunity to analyse more thoroughly the web performance of each page of your e-commerce funnel.

Quanta’s probes now analyse the loading times of the front-end elements (images, CSS, JS, external tags, …etc.) on each of your pages.


Our waterfall view

Our waterfall view allows you to track down each resource loaded. As such, you know what happened, for how long, and when! Also, to help you identify the most important issues, we provide you with Google Pagespeed recommendations.


Waterfall of the category page

In other words, you can now discover which of your super-heavy kitten images, or which JS line written by your freshly arrived intern, is jeopardizing your web performance and thus your revenue, AND take action right away to correct it.
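
If you want a quick, do-it-yourself feel for this kind of analysis outside of QUANTA, the sketch below downloads a page, extracts its image, script and stylesheet URLs, and times each one individually — a very crude waterfall. The URL is a placeholder and the regex-based extraction is deliberately naive.

```python
# Crude "waterfall" sketch: time each static resource referenced by a page.
import re
import time
from urllib.parse import urljoin
import requests

PAGE_URL = "https://www.example.com/category.html"   # placeholder

html = requests.get(PAGE_URL, timeout=30).text
# Naive extraction of src/href attributes pointing to static assets.
assets = re.findall(r'(?:src|href)="([^"]+\.(?:js|css|png|jpe?g|gif|svg|webp))"', html)

timings = []
for asset in set(assets):
    url = urljoin(PAGE_URL, asset)
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    timings.append((time.perf_counter() - start, len(resp.content), url))

# Slowest resources first: good candidates for optimization.
for duration, size, url in sorted(timings, reverse=True):
    print(f"{duration * 1000:7.0f} ms  {size / 1024:8.1f} kB  {url}")
```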

Pretty neat, huh?

Results?

Some of our clients already tried it, and had mind-boggling results. Seriously.


The first place actually goes to Madura, who reduced its frontend page loading times by 20%, in just one week!

But we’ll soon do a Case Study on this subject, so stay tuned to read about the real life applications of this new feature.

Heads of E-commerce, it’s time to choose the red pill!

If there’s one thing that’s true today, it’s that there really is no shortage of e-commerce websites. So it is essential for Heads of Marketing to find innovative ways to stand out in this cluttered market. To achieve that, not only do they use their creative sides to attract new visitors, but they also focus on web performance and SEO.

However, as the technology in these areas becomes more and more specialized, Heads of Marketing tend to multiply the apps they use and the companies they work with. Instead of doing everything internally, they tend to build their marketing castle with bricks from their own teams, a web agency, an SEO agency, and their hosting provider.

As a result, they can easily lose sight of the big picture and of who is doing what exactly, which leaves them clueless about the priorities to set in order to keep growing.

But that’s a thing of the past. Not unlike Neo in “The Matrix”, they are more and more eager to choose the red pill and finally take back control. They want to know what’s truly going on with their website, in all its aspects and specifics. They don’t want to be confined to a distant overseer position that doesn’t understand the technical challenges an online store presents, or to rely too heavily on others’ opinions to decide what’s good or bad for the future of the business.

So, hop into the Nebuchadnezzar, and take back control of your website’s processes, with the following 3 tips.

You will attract more visitors if you know who they are

Firstly, if you want to attract more visitors to your store, you have to target them specifically. No more nets thrown haphazardly into the big sea of the internet. Start by analyzing the profile of your customers and understanding where they come from.

Some marketers leave this analysis to external agencies. These agencies collect the data and then paint a portrait of the typical customer. But that’s not the only solution.

Other marketers use attribution tools to take back control of their data and do the analysis internally. This allows managers to see how visitors find their website and what led them to buy. More precisely, an attribution tool assesses whether a marketing campaign is successful or not.

Take a well-known attribution tool like Mazeberry, for example. It displays performance indicators and sorts all the data it receives in order to identify the most valuable indicators for you to use. Google Analytics and Bizible are also interesting for this task, as is Merchandising.io for a more specific “product performance” view.

Managing an advertisement process efficiently can greatly help cut customer acquisition costs, so attribution tools can become a powerful asset in a marketer’s toolbox. They can reduce customer acquisition costs by up to 15%.

You really need to schedule the publication of your content

Now that you are able to track the origin of your visitors and to know which meta-tags you should improve, you must keep your visitors on your website.

But these users have a limited amount of time to dedicate to your site, and are likely to visit at a particular time of day. Given this, scheduling the moment your content lands in their mailbox can really improve customer retention. To do this, though, you will need to dive into some metrics to understand what to send and when to reach your prospects, so as to have maximum impact in minimum time.

And that’s when automation tools really come in handy. 😀

Usually, these tools will give you precise indicators to identify the marketing channels (emailing campaign, social networks, paid publicity, …etc.) that bring the most visitors / customers / leads to your website.

They also allow you to preset messages to post on the best-known social media sites, manage your emailing campaigns, etc. An automation tool like HubSpot, for example, even offers to post automatically at the most relevant time (determined from the results of your previous marketing campaigns)! But it’s not the only name in the field: Datananas, IKO System, Hatchbuck and Marketo are also great options as automation tools.

If used correctly, these tools greatly increase the chances of conversion.

You need good analytics to know if each page is Web Performant

An online store has to be updated regularly, to add new articles or change prices for instance. But the more your website grows, the more you risk experiencing slowdowns and periods of unavailability. A slow website also entails a deteriorated customer experience (well, yeah! Nobody likes a long wait when shopping online!), and we all know that this leads to a significant loss of visitors, and thus of sales.

And that’s when you enter the world of web performance and optimization.

You want to expand your online store and attract more and more customers, without losing any speed or quality of UX? Well, optimization is there for you.

But before optimizing your website, you need to find out where the issues come from.

“Should I focus on the homepage, or the cart first?” “Are my web developers responsible, or my hosting provider?” To answer those questions, you may want to use a piloting tool. These tools will allow you to monitor your store and extract the data relating to the performance of your website. This is the purpose of tools like Google Analytics, Pingdom, Dareboost, Blackfire (a tool with a definite “tech/dev” vibe, that goes very deep into application optimizations) or Quanta.

As a piloting tool, Quanta extracts real-user data (via Google Analytics) as well as back-end, front-end and server metrics from e-commerce websites, analyses them, and identifies the root cause of slowdowns. The tool, intended for Heads of E-commerce and Marketing Managers, aims at helping you prioritize the optimizations best suited to your website, by presenting the data in clear business dashboards with actionable indicators.

So that’s it for these 3 tips to take back control of your e-commerce business growth!

Now you are truly ready to set aside the blue pill and delve into the truth of your data. To make good decisions, don’t be afraid to dig into the technicalities of your website anymore, and check out these solutions: they will help you concentrate on your real job, managing your team and your development strategy.

14 server knacks to optimize your Magento store’s performance!

Welcome to the first blog post, in a 3 part series, about the little knacks that you could implement to boost your Magento’s performance.

Web performance is key when managing an e-commerce website. Indeed, as you may already know, the speed of an online store is crucial for conversion, and thus for profitability. Customers are less and less willing to wait for a page to load when purchasing online. Conversely, users are very likely to make an impulse buy if your website is fast, since speed improves the user experience.

But conversion is not the only reason web performance should be a main focus of yours. There is also the little matter of SEO to take into account. Indeed, Google factors your website’s speed into its index: the faster your website is, the more likely it is to appear at the top of the search results and thus attract new visitors.

And while Magento remains the leading e-commerce solution on the market, it has a reputation for being quite slow (well… not as slow as PrestaShop, WooCommerce and OpenCart, but still… :P), which makes it essential to optimize Magento-run websites.

So, let’s quickly review together today this list of server optimizations for a Magento-based e-commerce website!

  • Get a dedicated server, and allocate enough resources to it.
  • Check that you have an architecture adapted to your needs.
  • Disable or reduce server logging (it can impact disk I/O).
  • Reduce the physical distance between your customers and your website, by bringing servers closer to your customers (kind of like the little surprise that QUANTA is preparing in Asia… 😉) and by using a CDN.
  • Use a reverse proxy so that PHP-capable workers are not tied up handling slow clients, serving static files, etc.
  • Use PHP-FPM: It does an awesome job as a PHP worker pool and it’s the best way to have an efficient Opcache.
  • Use KeepAlive directives so that several requests can reuse a single TCP connection.
  • Activate Gzip compression (a quick external check for these two settings is sketched right after this list).
  • Update your applications (e.g Nginx, MySQL) on a regular basis.
  • Optimize MySQL performance.
  • Install APC to speed up your code execution time.
  • Disable the Apache/Nginx modules you don’t need.
  • Uninstall debugging libraries (e.g xdebug, zend debugger).
  • Use SSD hard drives.
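
Several of the items above (Gzip and keep-alive in particular) are easy to believe enabled when they are not, so it is worth verifying them from outside your infrastructure. Below is a minimal sketch, assuming a placeholder URL, that inspects the relevant response headers; it is a sanity check, not a substitute for proper monitoring.

```python
# Quick external check for two items on the list: Gzip compression and
# connection keep-alive. The URL is a placeholder.
import requests

URL = "https://www.example.com/"   # placeholder

with requests.Session() as session:
    resp = session.get(URL, headers={"Accept-Encoding": "gzip"}, timeout=30)

    # requests transparently decompresses, so inspect the response headers instead.
    encoding = resp.headers.get("Content-Encoding", "none")
    connection = resp.headers.get("Connection", "keep-alive (HTTP/1.1 default)")

    print(f"Content-Encoding: {encoding}")     # expect "gzip" (or "br")
    print(f"Connection:       {connection}")   # "close" means no keep-alive
```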

Remember that depending on your architecture, your hosting and/or your server, these solutions could bring disappointing results. That is why a monitoring tool could help you to prioritize the best-suited optimizations for you.

But if everything goes according to plan, applying these principles will let you experience a significant improvement in page loading times.

This list, of course, is not exhaustive. But it’s a start 😉 Feel free to contact us if you have any other performance-related tips to share with us!

Beware of the rise of I/O on an e-commerce architecture

Maintaining good performance on your e-commerce shop requires paying attention to every detail. On the server side, one piece of data should be monitored carefully: the “Disk I/O” usage rate. A rise in “Disk I/O” refers to the phenomenon that occurs on your server when it is not able to read or write quickly enough on its hard drives (or on a network drive if you have an NFS server). So how do you avoid this phenomenon?

Monitor your I/O rate to avoid slowdowns

If heavy queries have been launched on the database (a back-office import? a product feed? a backup at the wrong time?), the drives become more heavily solicited and Magento struggles to run. One action at the wrong moment, and the iowait rate climbs…

[Figure: CPU usage graph with a rising iowait rate]

Without proper monitoring or corrective action, we have witnessed iowait rates (the time the system spends waiting for data to be read or written) climb, sometimes up to 40%! This means that 40% of CPU time is spent waiting for a read or write operation on a drive to complete. It is a monumental waste of server resources that greatly slows down Magento’s operation.
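
If you want to keep an eye on this figure yourself, the third-party psutil library exposes the same counters that top and vmstat report. Here is a minimal sketch that samples the iowait percentage at a fixed interval and flags anything above an arbitrary threshold; both values are assumptions to tune for your own servers.

```python
# Sketch: sample CPU iowait with psutil and warn above a chosen threshold.
# Requires: pip install psutil (the iowait field is reported on Linux).
import time
import psutil

THRESHOLD_PERCENT = 10.0   # arbitrary alert threshold, tune for your servers
INTERVAL_SECONDS = 5

while True:
    cpu = psutil.cpu_times_percent(interval=INTERVAL_SECONDS)
    iowait = getattr(cpu, "iowait", 0.0)   # field only exists on Linux
    status = "WARNING" if iowait > THRESHOLD_PERCENT else "ok"
    print(f"{time.strftime('%H:%M:%S')}  iowait={iowait:.1f}%  user={cpu.user:.1f}%  [{status}]")
```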

In an ideal world you should always try to keep a CPU graph that looks like this (see below the image of a healthy CPU graph):

[Figure: healthy CPU graph, with negligible iowait and most CPU time spent in user time]

In this example, with limited iowait and CPU time spent mainly in the blue area (the “user time”), the site works perfectly, queries are executed quickly… and visitors can enjoy your services in good conditions (as long, of course, as CPU usage does not come close to 100%! If that happens, you will have to optimize Magento or add servers to your architecture).

What must be avoided at all costs is a rise in iowait. When iowait peaks, serious consequences become visible in the site’s load times. In the example below, the site’s response time blows up as the iowait rate increases:

[Figure: site response time rising together with the iowait rate]

Pages take much longer to be generated by Magento, the visitors’ experience degrades, and conversion suffers as well. In this situation, Magento does everything it can to deliver the page, but it simply does not have enough hardware resources available. Each process therefore takes much longer to complete, and the site becomes slower.

Improve Magento Performance using a CDN

You may have already come across the acronym CDN, for Content Delivery Network, but do you know exactly what it means? More and more Magento sites improve their performance by acquiring a CDN, so it seemed appropriate to describe more precisely how it works. Explanation…

The problem with the response time of many media files

On your e-commerce site, you have dozens or even hundreds of files that are downloaded after the HTML code generated by Magento has been retrieved. These are typically images, CSS or JS files, which are downloaded from the same server that hosts your Magento site.

Now imagine that your server is hosted in France but that one of your potential clients connects from Canada. Or, alternatively, imagine that your data are hosted in Paris but a visitor connects from Nice.

As you can imagine, crossing the Atlantic, or even a country, makes your site a bit “slower” from the visitor’s perspective. Each file retrieval can add tens of milliseconds, which add up across every downloaded asset and can end up costing additional seconds!

The solution using a CDN

You can set up your website so that it uses a CDN. This means making a copy of your media files (usually the images on your site) in different datacenters spread out all over the world, which then send the requested files directly to users. For visitors, the data loads from the closest location hosting a copy of your files, which reduces distance and yields better performance for your Magento site. The CDN principle works the same way with other CMSs such as WooCommerce, PrestaShop or Hybris. With shorter round trips between the user and the servers, the response time is reduced. This will significantly improve your image download performance!

Setting up a CDN can prove rather complex without technical knowledge. Keep in mind that you do not need a CDN to host your entire site: only static files that rarely or never change, such as JavaScript files, CSS style sheets and images, should be placed on a CDN. Dynamically generated content, such as pages built by PHP scripts, should remain hosted on the origin server.
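
To make the static/dynamic split concrete, here is a small sketch of a URL helper that sends static assets to a CDN hostname and keeps everything else on the origin. The hostnames and extension list are placeholder assumptions; in practice, Magento and most CMSs let you configure the media and static base URLs directly rather than writing such a helper.

```python
# Sketch: route static assets to a CDN hostname, keep dynamic pages on origin.
from urllib.parse import urlunsplit

ORIGIN_HOST = "www.example.com"      # placeholder hostnames
CDN_HOST = "cdn.example.com"
STATIC_EXTENSIONS = (".js", ".css", ".png", ".jpg", ".jpeg", ".gif", ".svg", ".woff2")

def asset_url(path: str) -> str:
    """Return the full URL for a site path, served from the CDN if static."""
    host = CDN_HOST if path.lower().endswith(STATIC_EXTENSIONS) else ORIGIN_HOST
    return urlunsplit(("https", host, path, "", ""))

print(asset_url("/media/catalog/product/shoe.jpg"))  # -> https://cdn.example.com/...
print(asset_url("/checkout/cart"))                   # -> https://www.example.com/...
```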

Using a CDN is an advanced optimization. It’s not a small change in code or a solution that you can easily achieve on your own if you do not have technical profiles in your company. Investing in a CDN depends on your priorities, goals, and also on your budget (using a CDN will generally require an additional hosting cost from a specialized provider, but given the speed impact, it will be profitable).

If you’re an online retailer, begin by asking your hosting partner whether they offer a CDN and whether it would benefit you; otherwise, your web agency or your hosting partner should be able to help you set one up.

Adapt your site to mobile users with Page Speed Insights

Do you know Page Speed Insights? This handy tool, made by Google, measures the performance of a page on mobile devices and on fixed devices (desktop PCs). Site performance is evaluated, and a final score (from 0 to 100) lets you see where you stand.

Page Speed Insights and loading time

 Page Speed Insights focuses on two essential loading times:

  • the loading time of above-the-fold content (the time between the user’s request and the complete display of the visible part of the page);
  • the time to fully load the page (when all elements are displayed).

Page Speed Insights turns out to be very useful because it focuses on the performance of the page as processed by the user’s browser. This focus, which sets network performance aside, gives a global view of your site’s performance and shows the areas to improve in order to make your site evolve. Google improves its tool frequently; in 2014 it introduced a number of innovations, with emphasis on evaluating site performance on mobile devices (smartphones and tablets) and on improving usability.
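
Page Speed Insights can also be queried programmatically. The sketch below calls the public PageSpeed Insights API (the v5 endpoint, at the time of writing); the page URL is a placeholder, and the exact response fields can differ between API versions, so treat the field access as an assumption to verify.

```python
# Sketch: query the PageSpeed Insights API for a page's mobile performance.
# Response structure may vary by API version; adjust field access as needed.
import requests

API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGE_URL = "https://www.example.com/"   # placeholder page to analyse

params = {"url": PAGE_URL, "strategy": "mobile"}  # add "key": "<API key>" for heavy use

data = requests.get(API_ENDPOINT, params=params, timeout=60).json()

# In the v5 response, the Lighthouse performance score is a fraction (0..1).
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {PAGE_URL}: {score * 100:.0f}/100")
```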

Evaluate the speed of your mobile site thanks to Page Speed Insights

Since mid-2014, Page Speed Insights has integrated new recommendations for mobile sites. Google is showing its commitment to providing webmasters with solutions that help them build mobile-friendly sites. “Suppose your mobile site takes 2 seconds to load instead of 7 as it used to. If users still have to spend 5 more seconds waiting once the page has loaded in order to zoom or scroll the screen before they can read the text and interact with the page, then the site is not truly fast. The new features of Page Speed Insights can help you find and fix these usability issues,” says Google in the post announcing these updates.
