Article

24.07.2017

Big data challenging traditional purchasing decisions

Going forward, Big Data is enabling companies to make more informed purchasing decisions: this should mean fewer losses, less overstock and fewer out-of-stock products. What role does the purchasing director play in this?

Too much unnecessary stock threatens your company's growth. What is the solution? Optimise purchases in real time to reduce production costs, improve cashflow and retain total control over your budgets. According to a survey carried out by Lightspeed POS, 54% of companies are counting on Big Data to make smarter purchases. However, this still requires investing in the right technologies and, as part of that, revisiting the role of the CPO (Chief Procurement Officer).

The digital imperative

Buyers now clearly need to be able to run predictive analyses on demand and want an overview of their sourcing. Big Data can help by enabling a degree of automation in procurement, particularly for direct purchases. Going forward, Big Data solutions can deliver this through traceability. These solutions can take several forms: stock movements (thanks to the IoT and sensors), tracking of sensitive or fresh products, tracking fleets and haulage, and business intelligence to capture demand.
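By way of illustration, here is a minimal sketch, in Python, of how stock-movement events from sensors could be turned into simple reorder alerts. The event format, product codes and reorder points are illustrative assumptions, not features of any particular solution mentioned here.

```python
# Minimal sketch (hypothetical event format): turning IoT stock-movement
# events into reorder alerts for the purchasing team.

# Each sensor event is assumed to carry a SKU, a signed quantity change
# and a timestamp; a real feed would arrive via a gateway or message queue.
events = [
    {"sku": "SKU-001", "delta": -3, "ts": "2017-07-24T09:12:00"},
    {"sku": "SKU-001", "delta": -2, "ts": "2017-07-24T10:05:00"},
    {"sku": "SKU-002", "delta": 50, "ts": "2017-07-24T08:00:00"},
]

def stock_levels(opening_stock, events):
    """Apply signed movements to opening stock and return current levels."""
    levels = dict(opening_stock)
    for e in events:
        levels[e["sku"]] = levels.get(e["sku"], 0) + e["delta"]
    return levels

def reorder_alerts(levels, reorder_points):
    """Return the SKUs whose current level has fallen below the reorder point."""
    return [sku for sku, qty in levels.items() if qty < reorder_points.get(sku, 0)]

opening = {"SKU-001": 8, "SKU-002": 20}
levels = stock_levels(opening, events)
print(reorder_alerts(levels, {"SKU-001": 5, "SKU-002": 30}))  # ['SKU-001']
```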

Yet, adopting these solutions will involve an organisational change with regard to the role of buyers, as indicated in the latest Deloitte Global Chief Procurement Officer Survey 2017 report. What are the major trends?

  • Digital is at the heart of purchasing decisions

    75% of purchasing directors believe that their role in delivering the company's digital strategy will increase. Moreover, 75% have executive support to achieve this. Analytics, already used in negotiations (by 58% of respondents) and to improve process efficiency (by 57%), is the technology expected to have the greatest impact on purchasing over the next two years. What are most purchasing directors focusing on? Half of them say data quality is the most important factor.

  • Cost reduction

    There is still uncertainty surrounding the economic and geopolitical environments. The number one priority for 79% of those surveyed is reducing costs to achieve the best results and finance growth. 

  • Redefinition of roles and the war for talent

    With automation, the role of purchasing departments is changing: when contract and procurement monitoring is left to artificial intelligence, buyers can concentrate more on upstream needs, data management and governance. 60% of purchasing directors do not believe their teams have the skills to deliver a procurement strategy. This is an issue, because 87% of them recognise that these skills are the greatest driver of procurement performance.

 

"The traditional operating models for procurement are currently changing. This is being led by a lack of talent and digital innovation growth."
Magali Testard, Partner in Charge of Procurement & Supply Chain Advice, Deloitte.

 

Article

24.10.2016

Technology is transforming the supply chain

The rate of adoption of new technologies in the supply chain has not yet passed 26%, but it should reach 75% in ten years' time. What does this mean for you and how can you prepare for it?

The integration of technology into the supply chain has long been the exclusive preserve of major companies, often motivated by the idea that their departments or subsidiaries will become more efficient by being interconnected. The most striking example of this is the adoption by FedEx in 1985 of a system of handheld scanners, enabling real time package tracking. A revolution back then, but commonplace today!

"Since the 1980s, computer technology has advanced at such a phenomenal rate that it is currently far ahead of the ability of the supply and logistics field to adequately utilize the new technologies", according to Georgia Tech.

In the midst of the revolution in mobile computing, the latest white paper from C3 Solutions (the source of the adoption figures quoted above) explores how IT will shape the supply chain in years to come, thanks to three basic levers:

1. Mobility is more accessible

The policy of “bring your own device” (BYOD) has spread throughout businesses and brought down technological barriers, a move facilitated by falling equipment prices. To put this in perspective, an iPhone 5 has 2.7 times the processing power of the Cray-2 that FedEx had in 1985. The areas that benefit most from this are in logistics: dispatch management, GPS navigation, dock scheduling, parcel tracking, proof of delivery and customer service.

2. Cloud, SaaS and Big Data are becoming instinctive

Like mobile supply chain apps, cloud computing, though a strong trend in IT, is still immature and has yet to reach its full potential. Although the costs are difficult to calculate, the roll-out of SaaS (software as a service, where the company's applications are hosted in the cloud) is becoming more and more natural, as it fits the increasingly digital nature of the supply chain, particularly in view of the proliferation of connected items (the Internet of Things) in the business environment.

Cloud-based apps are the ideal way to bring data silos together for analysis, but Big Data is not a new phenomenon for brands. What is different now is the speed of the solutions relative to the volumes processed, and the availability of analytical tools to put this data to work. Accenture notes that organisations using analytical solutions see faster reaction times (47%) in their supply chain.

3. Web APIs also reign outside "dotcoms"

Application Programming Interfaces have become the plumbing of the Internet of Things: they enable communication between different services or apps. For example, US giant Target required its suppliers to connect to its own APIs in an attempt to curb stock-outs. The company managed to reduce delivery windows from two days to one. It also penalises suppliers using outdated tools if deliveries do not arrive within the scheduled window.
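To make the idea concrete, here is a minimal sketch of how a supplier might poll a retailer's inventory API to spot stores at risk of a stock-out. The endpoint URL, authentication scheme and response fields are assumptions for illustration only; they do not describe Target's actual, non-public interface.

```python
# Minimal sketch (hypothetical endpoint and fields): a supplier polling a
# retailer's inventory API to list stores whose stock of a product is low.
import requests

API_URL = "https://api.retailer.example/v1/inventory"  # illustrative URL
API_KEY = "supplier-api-key"                            # illustrative credential

def stores_at_risk(sku, threshold=10):
    """Return store IDs whose on-hand quantity for `sku` is below `threshold`."""
    resp = requests.get(
        API_URL,
        params={"sku": sku},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"stores": [{"store_id": "...", "on_hand": 42}, ...]}
    return [s["store_id"] for s in resp.json()["stores"] if s["on_hand"] < threshold]

if __name__ == "__main__":
    for store in stores_at_risk("SKU-12345"):
        print(f"Schedule a replenishment delivery for store {store}")
```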

Intelligent use of these tools should result in significant supply chain optimisation: increased visibility, better cost control, more efficient integration between companies, more accurate tracking and planning, not forgetting improved regulatory compliance. To gain a better understanding of the issues involved, read the white paper entitled "Technology Reshaping the Modern Supply Chain" published by C3 Solutions. It can be downloaded free of charge by simply registering on the publisher's website.

Source: C3 Solutions

Article

12.12.2016

What "local" contributes to your supply chain

Countries that have spent the last 30 years closing their factory doors are seeing an unexpected return to local industry. Short supply chains may even have taken on a certain value in an unstable economic climate.

Since the start of 2016, international trade volumes have been falling. For the first time since the Second World War, a trade treaty with the United States has failed during a period of growth. Both candidates in the American presidential election recently opposed the Trans-Pacific Partnership in order to embrace "Made in the USA". The United Kingdom voted for Brexit. In Belgium, Wallonia went head-to-head with the European Union and Canada over CETA. In other words: globalisation is no longer an unshakeable conviction.

A survey carried out by SCM World indicates that in the United States, companies are three times more likely to recruit (than dismiss) staff in the supply chain area. The trend is identical in European countries such as Germany, the Netherlands and even Great Britain, as the diagram below shows.

[Diagram: SCM World survey of supply chain hiring intentions by country]

(Source: SCM World http://www.scmworld.com/wp-content/uploads/2016/11/161103-November-KOM_Image1-Web.png)

Made in China was synonymous with everyday low prices. The paradigm has shifted.

 

During the 1990s, international trade grew twice as fast as the world economy. Europe united under a shared currency. China became the world's factory. Tariffs fell, as did transport costs. Nevertheless, if the New York Times is to be believed, the Walmart revolution is over. So China joined the club of formerly prosperous nations: it used its factories to build a middle class. As for Europe, economic stagnation now makes the signing of trade agreements more complex.

Conclusion: local is back and is rewriting the rules of the supply network. Why?

Distance increases costs

It makes no sense to point to container capacity and the fall in the price of oil: transport from Asia to Europe is more expensive and riskier than before. Add to this the doubts surrounding trade regulations.

Local is more flexible

The example of Trellebord in Sweden is particularly enlightening. This SME chose to use robots to automate part of its work in a high-wage country without having to relocate.

For Kevin O'Marah, SCM World supply chain expert, "adaptable platform design is Western industry's best defence against devastating supply constraints. The more such vital inputs can be shared, the better the business can support local production with limited bill-of-materials risk. This is true for Mondelēz, which is pioneering this approach in food, or BMW in automotive."

 

Article

14.12.2016

As long as there is data…

The holy grail of big data? Creating an unprecedented customer experience. But why does a start-up with no history manage to build an emotional connection where its elders, buried under data, can only dream of such closeness? What is the winning alchemy?

Storing and processing digital data is nothing new. Neither is data mining. But with connected objects and mobile usage, data is literally pouring in. Text messages, chats, photos, videos, search engine queries, clicks on the web, route requests on Google Maps and elsewhere, online payments, customer contacts via chatbots or messaging, automatic reordering from a smart fridge… we produce data constantly without even realising it! Even when we simply accept geolocation or connect to a wifi hotspot...

By 2020, the volume of data is expected to have multiplied by 50. A connected car, for example, produces millions of data points in a single hour, useful not only to the automotive industry but also to insurers and e-commerce. And the stakes are no less promising: adjusting a strategy, personalising a service, making better decisions, detecting trends, making predictions…

There have always been statisticians interpreting the figures of the past in order to improve the future, but today's 'data scientists' are geeks. University courses are emerging, yet data is exploding at a pace our knowledge can barely keep up with. Only machines are still able to handle such data flows. Machine learning techniques make it possible to do better, faster. A standard for the proper use of artificial intelligence is reportedly in the works at the initiative of names such as Google, Facebook, Amazon, IBM and Microsoft. For Nicolas Méric, founder and CEO of the start-up DreamQuark, which applies deep learning to healthcare and insurance, such technologies boost human capabilities but are not destined to do without them.

Who is affected?

No sector really escapes the need to collect its data and make it pay off by transforming its environment. But some are clearly in more of a hurry, or more opportunistic, than others. Telecoms, transport, and gas, water and electricity suppliers are leading the way: SNCF, but also the beauty products manufacturer Nuxe, scan every online channel for customer verbatims in order to know their customers better. Lift manufacturer ThyssenKrupp, which wants to pamper its cabins and above all their users, collects a host of parameters on them in order to fine-tune maintenance and anticipate unpleasant breakdowns.

Those in charge of Big Data in companies face three main challenges, grouped under the so-called '3V' rule: handling large Volumes, taking into account the infinite Variety of information, and coping with the Velocity at which it is generated. Banks are no exception. These companies, which in fact have a lot to gain since they hold huge amounts of transactional information about their customers and run processes of all kinds, face a challenge of their own: using such a treasure trove to test new value-added services in the shortest possible time.

Momentum

Jean-François Vanderschrick is Head of Marketing Analytics & Research at BNP Paribas Fortis: "What fascinates me is less the sheer quantity of available data and connected objects than everything technology now allows us to do with them. Not a day goes by without my being surprised by something new. JP Morgan detects trends by buying photos of supermarket car park occupancy. China is developing facial recognition to adapt the layout of its interfaces to the customer's expression. You can track your made-in-USA pair of socks from dispatch right up to the moment it crosses your doorstep… All of this is part of our daily lives at the very moment a bank is signalling its intention to adapt to the stage of life its customer is going through (a customer it has followed since they entered working life) in order to offer them just what is useful to them."

At BNP Paribas Fortis, data management recently took a new step forward with the appointment of a Chief Data Officer who sits on the Executive Committee, Jo Couture. This also means additional staff, new analytical tools and new capabilities.

Jean-François Vanderschrick: "Data analytics must allow us to improve the customer experience and keep costs under control, and in the end that generally leads to greater efficiency."

In his view, the adoption curve is only just entering its exponential phase.

Timing is as important as the service itself

Data serves a multitude of areas: operational excellence, marketing, fraud detection, credit risk… Companies now understand that they have to turn their data into knowledge and services, and many of them have everything it takes to do so. However, they must avoid drowning in the mass of information. The hardest part, and a source of frustration, is probably gaining access to the data and managing to qualify it. Compliance requirements naturally tend to slow developments down. Reducing the 'data to market' time nevertheless remains a major challenge, because time to market often proves far too long. It is also about offering a real-time service, as at Monoprix, which analyses the processing of the 200,000 daily orders from its 800 stores in order to act directly on its supply chain, a critical process for the French retailer.

"It is a delicate alchemy to strike between testing (the mock-up of a service is often very appealing, but you still have to succeed in scaling it up), measuring risk and prioritising objectives," maintains Jean-François Vanderschrick.

Educating the algorithm

Provided you have the data and the technology, and there are financial stakes attached, imagination remains our only limit in unlocking the value of data. Alongside large, complex projects, relatively simple quick wins are entirely possible and desirable here too, in particular to allow the company's operational departments to carry out basic analyses on large volumes of data. "Today, a variety of pieces of information that may seem trivial can enlighten us and trigger action: a customer who starts working with a competitor, who places credit lines elsewhere, who borrows a particularly large amount, who trades with another country… all information that, commercially speaking, deserves our full attention and that is deemed useful in 70% of cases," adds the BNP Paribas Fortis manager. Analysing a customer's transactional pattern would make it possible to take better credit decisions. The relevance of decisions can be improved considerably compared with what we could do without a model, argues Jean-François Vanderschrick, who adds:

"Thanks to machine learning, we are educating the algorithm to provide ever more relevant answers."
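By way of illustration only, the sketch below trains a simple classifier on synthetic transactional features; the features, labels and choice of model are assumptions made for this example and say nothing about the models actually used at BNP Paribas Fortis.

```python
# Minimal sketch (synthetic data): scoring credit risk from transactional
# features with a simple model that is refitted as new outcomes arrive.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Illustrative features per customer: monthly inflows, outflows and the share
# of payments made abroad. Labels (1 = later credit problem) are synthetic.
X = rng.normal(size=(1000, 3))
y = (X[:, 1] - X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Each new batch of labelled outcomes can be used to refit the model, which is
# one simple way an algorithm gets "educated" to give more relevant answers.
```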

If 'big is better', is it within reach of smaller companies?

Thanks to the cloud (online storage space), SMEs now have the storage capacity, coupled with the computing power needed to exploit the data. That is one of the major challenges of Big Data. The second is knowing how to process it. Cloud-based business management software, such as CRM systems, order or production-cost tracking tools and supplier traceability, makes big data accessible to small and medium-sized enterprises. The only condition: bringing all of their data together in one place. The difference between corporates and SMEs will play out over the long term. But SMEs, for which a top statistician would be unaffordable, can always buy in targeted studies and enrich their data with external databases…

(Sources: BNP Paribas Fortis, Les Echos, Transparency Market Research, IDC, Ernst & Young, CXP, Data Business)

Article

27.12.2016

These 4 Silicon Valley giants want to win over your IT department

Already champions of everyday life, Google, Facebook, Slack and LinkedIn are adopting innovative and complementary approaches to win over companies. What strategies are they implementing to convince you?

Google: the value of data intelligence

Google is adopting an approach which goes beyond communication tools and suites of productivity apps and services. The company has largely transformed its business divisions so that they can exploit cloud infrastructures, big data, analytics and machine learning as a matter of priority. Two competitors stand in its way, Amazon and Microsoft, for different reasons. Developers have been using Amazon Web Services for a long time, which gives it a track record of trust. Microsoft (Cloud, Office) also has a long-standing presence in IT departments around the world. In this field, which involves the processing of sensitive data, Google still needs to evangelise: a company is not as easily convinced as a consumer, particularly when it comes to strategic or confidential data. Its weapon: the power of its artificial intelligence tools for processing data silos.

Facebook: introducing WorkPlace, naturally

After more than a year of development with partner companies such as Danone, Starbucks, Royal Bank of Scotland and Booking.com, Facebook officially launched WorkPlace last October. This Facebook spin-off enables organisations to create an internal social network, completely private and secure, within an interface employees already know from everyday life, putting it in head-on competition with established tools such as Chatter (Salesforce) and Yammer (Microsoft). Unlike the free Facebook, WorkPlace is billed monthly depending on the number of users: $3 per user for the first 1,000, $2 for the next 9,000 and $1 beyond 10,000 users.
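As a rough guide, here is a small sketch of how that tiered billing adds up for a given number of users (per month, in dollars, using only the figures quoted above).

```python
# Tiered monthly cost of WorkPlace as quoted above: $3 per user for the first
# 1,000 users, $2 for the next 9,000 and $1 for every user beyond 10,000.
def monthly_cost(users: int) -> int:
    tier1 = min(users, 1_000)
    tier2 = min(max(users - 1_000, 0), 9_000)
    tier3 = max(users - 10_000, 0)
    return 3 * tier1 + 2 * tier2 + 1 * tier3

print(monthly_cost(500))     # 1,500 dollars
print(monthly_cost(12_000))  # 3,000 + 18,000 + 2,000 = 23,000 dollars
```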

Slack: real-time collaboration becomes mainstream

Despite the introduction of Microsoft Teams on its turf, Slack remains confident in its strategy of creating tools that allow greater communication and productivity within companies.

"We find this offensive both flattering as well as intimidating, given Microsoft's means, but we think there is sufficient space in the market for several players", declared April Underwood, VP of Slack at the beginning of November.

It is a market that Slack has largely helped to open up and drive by introducing the concept of real-time collaboration. Its weapon? Agility, despite its still limited size, and its proven, widely copied tools. The result: 4 million daily active users and constant growth.

LinkedIn: B2B marketing for... Microsoft

Microsoft closed its acquisition of LinkedIn at the beginning of December. The transaction, which runs into billions of euros, was followed closely by the European Commission. Despite a strong position in the business world, mainly in human resources, LinkedIn needs the 25 billion euros from Microsoft to pursue its offensive in the field of professional tools, in a hugely competitive climate. For Microsoft, the acquisition will enable the company to reach B2B marketing targets such as recruitment agencies, head-hunters and businesses. To explain the synergy being sought in simple terms, Microsoft CEO Satya Nadella gives the example of a meeting where every participant's LinkedIn profile is displayed, linked to the invitation.
