
The Danish startup putting the killing blow into key encryption technology

Danish encryption specialist Sepior, founded in 2014, was built on the back of ground-breaking encryption projects and the support of the EU’s Horizon 2020 programme. In discussion with IBS Intelligence, it revealed that it has lots more surprises up its Fair Isle jumper.

Sepior’s big break came with the EU’s Horizon 2020 initiative, an irony not lost on CEO Ahmet Tuncay – as we spoke to him, the chaos of Brexit continued to engulf Europe.

Ahmet Tuncay, Sepior CEO, said: “Yes, we’re a truly Danish company and found our footing within the Horizon programme, which deals mostly with small to medium enterprise projects, or SMEs. For companies with promising technologies, the EU economic commission provides grants for the ones it believes will become a commercial success. But there’s a fairly high bar to clear before it grants you this money; you have to commit to specific milestones and strict targets. The commitment the founders of the company made was: ‘If you give us these funds and support, we’re going to create economic activity within the EU, which means hiring people and growing the company’.”

He continued: “Our obligation was really to take that money and create a piece of commercially viable technology. At the early stages, specific use cases aren’t as important as the foundational technology and broad market appeal. Once the foundation was created, we wanted to be able to acquire institutional funding to go and build a business. In the long term our obligation is to create jobs, insofar as the EU is concerned, but now we have commitments to our shareholders, so it’s not just jobs that matter today.”

Tuncay says: “If you just look at the size of the market for encryption key management, you’re not going to be impressed by the number; it’s only around a $1 billion market. But if you take the same technology, repurpose it and apply it to commercial asset exchanges, which is a $300 billion market, and find a way to participate in a revenue-sharing opportunity, you’ve moved yourself from a $1 billion market to a $300 billion market. You then have to figure out how to extract your fair share from that activity.”

The numbers are certainly impressive if you consider the fees that brokers and exchanges collect – a vast amount, certainly more than the $1 billion market for encryption key management. At several hundred billion dollars it is a lucrative market to be in, and few companies are good enough to offer a differentiated service that captures new customers.

Tuncay says: “Our investors recognised that the big pain of cryptocurrency activity is that if you lose the coins, they’re gone forever. That turns up the need for novel security solutions more than ever. The digital wallet containing the cryptocurrency assets must be hosted in trusted custody and the transactions involving the wallet must be protected against malicious or incompetent brokers and clients. The need for a higher level of security means having multiple signatures and multiple approvers, which is obviously more secure than having just one. When you have multiple approvers using our ThresholdSig technology versus MultiSig, or multiple-signature, technologies, we can deliver very high levels of security and trust along with some operational benefits for the exchange, because the administration of the security policies – adding people, removing people, replacing lost devices, and deciding who can participate in those signatures – is all done off-chain and it’s simple.”

The alternative approach is to use MultiSig, which is all on-chain, so when you change the policies you have to broadcast them, telling everyone who the approvers are and what the policies say, which is not good for security. You may also have to reissue or generate new keys. A lot of administrative bureaucracy goes with that approach. Until recently MultiSig has been the gold standard for cryptocurrency transactions requiring multiple approvers, but ThresholdSig provides an equal or higher level of security in a more flexible, lower-administration environment, and it also has the potential to reduce the size of the transaction recorded on the blockchain. With MultiSig, the blocks recorded on the ledger actually contain every signature that signed off on the transaction, which of course increases the block sizes.

Tuncay says: “With ThresholdSig there’s only one signature that goes on the ledger, so it actually reduces the amount of data on the ledger. It turns out these signatures are a substantial portion of the total transaction size.  So, there’s this kind of tertiary benefit that could end up being quite material, because it means that the blocks can contain more transactions. Blocks are typically fixed in size, so if the transactions are smaller you get more of them onto the chain.  In some of the currencies, like Bitcoin, it’s already hitting capacity on processing.  So, if you can have the highest level of security and smaller transaction sizes it’s going to maximise throughput.”

There is the hope that ThresholdSig transactions will also attract lower transaction fees than MultiSig. ThresholdSig transactions appear as a single-signature transaction on the blockchain. Historically, single-signature transactions are the smallest in size, allow for the maximum number of transactions per block and typically carry the lowest mining fees. Sepior’s expectation is that an exchange could end up with lower transaction fees, higher security and lower administrative overhead. So there are some very compelling reasons why this technology is going to be relevant to a far wider audience than it has been up to now.

Sepior’s investors were on the front edge of recognising threshold schemes – the cryptography approach based on multiparty computation – and how that technology could bring real benefits in this use case. As Tuncay says: “We’re focusing on the implementation around cryptocurrency exchanges and hot wallets, but this technology is applicable to a much wider range of applications. So next month we’re going to be making some announcements around more blockchain-generic solutions, to provide more privacy on private blockchains in general. There is a whole series of problems with using distributed ledger technology for business, and one of these is scalability. How do you support – for example, in the case of logistics tracking operations – a container being loaded and shipped from a point in China to a destination in Los Angeles? Sometimes there are 35 or 40 different parties involved in that transaction. These parties don’t necessarily need to know everything on the blockchain. Effectively all the transactions are on the chain. So all parties that are participating in the chain can validate and see their own transactions but need not see the confidential data of other parties. One strategy for this has been to create virtual blockchains called channels, which is used in Hyperledger Fabric, but its use creates a messy scalability problem.”

Tuncay says: “If I were to generalise it further, while a blockchain is supposed to contain transactions that are immutable because everybody on the chain can validate them, the downside is that everybody on the chain can also see everything on the chain. So how do you create an application like logistics tracking where there are 30 parties on the chain and you want every party to have a different view of it? Our solution to this – and the existing solutions have proven to be unscalable – is based on an access control policy that relies on encryption to make only the intended parts of the chain available to users, based on their permissions.

“There is nothing magical about this, we’re just using our underlying key management system and fabric. But once we make this available, it will also enable the creation of privacy-preserving chains that are massively more scalable than what is possible today. We think there’s value there; again, this is something that we’re going to go and test out, and we’re involved in activity with several large companies to validate this. We think that it’s worthwhile.”
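To illustrate the general idea of permission-based visibility – this is not Sepior’s implementation, just a minimal sketch – the Python snippet below encrypts each ledger entry with a key held only by its visibility group. Every party can see that an entry exists, but only parties holding the group key can read its contents. The group names and payload fields are hypothetical.

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# One symmetric key per "visibility group"; in practice these would be issued and
# rotated by a key-management service rather than created in-process like this.
group_keys = {
    "shipper_and_customs": Fernet.generate_key(),
    "shipper_and_buyer": Fernet.generate_key(),
}

def record_transaction(payload, group):
    """Encrypt a payload so only the named group can read it; the result is what
    would be written to the shared ledger."""
    token = Fernet(group_keys[group]).encrypt(json.dumps(payload).encode())
    return {"group": group, "ciphertext": token}

def read_transaction(entry, my_groups):
    """A party can decrypt only entries for groups whose key it holds."""
    if entry["group"] not in my_groups:
        return None  # visible on the chain, but unreadable to this party
    data = Fernet(group_keys[entry["group"]]).decrypt(entry["ciphertext"])
    return json.loads(data)

ledger = [
    record_transaction({"container": "MSKU123", "declared_value": 180000}, "shipper_and_customs"),
    record_transaction({"container": "MSKU123", "eta": "2018-11-02"}, "shipper_and_buyer"),
]

# The buyer holds only the shipper_and_buyer key, so it can read one of the two entries
print([read_transaction(e, {"shipper_and_buyer"}) for e in ledger])
```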

Fundamentally, Sepior is providing fine-grained control over who has visibility into what on the blockchain. The key words here are ‘threshold cryptography’. Sepior is pioneering and leading the industry in the field of threshold cryptography, applying these key management concepts in a manner that is more scalable and works in distributed environments with a high degree of efficiency. Part of the threshold aspect, in the case of a crypto wallet, is that you might have four parties who are available to approve a transaction, but a threshold that says if any three approve, it will be accepted as a valid transaction. Therefore, you can define a threshold so that if somebody loses their phone, or their device gets hacked and we no longer want to trust it, it can be excluded while the others continue to transact and do business.

Tuncay says: “When you move into the blockchain application, the threshold aspect is more around signing key availability and management. What we’ve done here is to take the key management function and distribute it using multi-party computation (MPC). We’re able to distribute the key generation and management functions across multiple virtual servers, if you will, in the cloud, such that no individual server has a full key that could be hacked or stolen. But collectively, maybe two out of three of these virtual servers can provide keys for all the users that require access to the content on that blockchain. This threshold aspect gives a high degree of availability, reliability and integrity to both the encryption and the key management.”
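For intuition about the threshold property, here is a minimal Shamir secret-sharing sketch in Python: any two of three shares recover a value that no single share reveals. It is only an illustration – in Sepior’s MPC approach the full key is never assembled in any one place, whereas this toy example does reconstruct it.

```python
import random

PRIME = 2**127 - 1  # large prime field for the arithmetic (illustrative only)

def split_secret(secret, n=3, t=2):
    """Split `secret` into n shares; any t of them can reconstruct it."""
    # Random polynomial of degree t-1 whose constant term is the secret
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret from t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    key = 123456789
    shares = split_secret(key, n=3, t=2)
    assert reconstruct(shares[:2]) == key               # any two shares suffice
    assert reconstruct([shares[0], shares[2]]) == key   # a different pair also works
```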

For this Danish company, it looks like blockchain will deliver The Killing it deserves.


The spreadsheet challenge as banks move processes to European financial centres in preparation for Brexit

Henry Umney, CEO, ClusterSeven

Uncertainty around Brexit continues, but practical preparations have begun – many banks are now well in the throes of duplicating or moving systems and business processes from London to other financial hubs.

Extricating processes isn’t going to be an easy task. There are two aspects to this separation: formal, IT-supported enterprise systems and grey IT (end-user-supported IT systems). Most banks have the understanding and the ability to disentangle the core enterprise systems effectively. Where banks are likely to come unstuck in this extrication activity is wherever there is end-user-supported IT – commonly Microsoft Excel spreadsheet-based processes – that is deeply linked with the rest of the banking group’s enterprise systems.

If a bank is required to set up a separate entity in the UK, all the data residing in ancillary spreadsheets that feed data into the various systems pertaining to this jurisdiction will need to be delinked/duplicated and housed separately too. For instance, as banks separate their Treasury operations, there will likely be certain processes that heavily rely on common Bloomberg and Reuters market feeds that are owned by or have deep linkages to the banking group’s systems. Similar issues will arise for capital modelling-related processes. While previously a bank might be evaluating business risk based on its aggregated position across its European operation, post-Brexit, determining the UK entity’s risk position will require the financial institution to disconnect and separate the relevant data for this jurisdiction.

Essentially, as banks duplicate their enterprise systems for specific jurisdictions, they need to do the same for the spreadsheet-based application landscape that they rely on operationally.

Disentangling these unstructured but business-critical processes manually will prove all but impossible, and eye-wateringly costly. Typically, spreadsheets surround the core systems such as accounting, risk management, trading, compliance, tax and more. Complete visibility of the spreadsheet-based process landscape is essential to identify the ones that need to be duplicated or extricated for the new entity, but due to the uncontrolled nature of spreadsheet usage, there will potentially be thousands of such interconnected applications and no inventory of these processes.

Banks should consider adopting an automated approach to safely extricating their spreadsheet-based processes. Spreadsheet management technologies can scan and inventory the entire spreadsheet landscape based on very specific complexity rules and criteria. The technology can expose the data lineages of individual files across the spreadsheet environment to accurately reveal the data sources and the relationships between the applications.
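As a rough illustration of the kind of scan such tools perform, the sketch below (Python, using the openpyxl library, assuming .xlsx files) walks a folder, pulls the formulas out of each workbook and flags references to other workbooks – a crude data-lineage inventory. Commercial spreadsheet-management platforms do far more (risk scoring, change monitoring, audit trails); the folder name here is hypothetical.

```python
import re
from pathlib import Path
from openpyxl import load_workbook  # pip install openpyxl

# Matches external workbook references inside formulas, e.g. '[Budget.xlsx]Sheet1'!A1
EXTERNAL_REF = re.compile(r"\[([^\]]+\.xls[xmb]?)\]", re.IGNORECASE)

def inventory(folder):
    """Return {workbook: set of workbooks it pulls data from} for .xlsx files under `folder`."""
    lineage = {}
    for path in Path(folder).rglob("*.xlsx"):
        wb = load_workbook(path, data_only=False)  # keep formulas rather than cached values
        sources = set()
        for ws in wb.worksheets:
            for row in ws.iter_rows():
                for cell in row:
                    if isinstance(cell.value, str) and cell.value.startswith("="):
                        sources.update(EXTERNAL_REF.findall(cell.value))
        lineage[path.name] = sources
    return lineage

if __name__ == "__main__":
    for book, feeds in inventory("treasury_models").items():  # hypothetical folder
        print(f"{book} <- {sorted(feeds) or 'no external links found'}")
```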

This approach is already proven in M&A-type operational transformation situations, which to some extent resemble the Brexit scenario. Aberdeen Asset Management adopted this approach to separate out Scottish Widows Investment Partnership (SWIP) when it bought the business from Lloyds Banking Group. Due to the number of intricately connected spreadsheets across the vast spreadsheet landscape and the complexity of the business processes residing in this environment at SWIP, manually understanding the lay of the land was unfeasible. Utilising spreadsheet management technology, SWIP inventoried the spreadsheet landscape, identified the business-critical processes and pinpointed the files that required remediation. Simultaneously, the technology helped expose the data lineage for all the individual files, revealing their data sources and relationships with other spreadsheets. SWIP was able to securely migrate the relevant business processes to Aberdeen Asset Management and, where necessary, decommission the redundant processes.

Post-Brexit too, banks have a lot more to gain from automated spreadsheet management. Spreadsheets will likely be used to set up temporary business processes and solutions for the new operations. Spreadsheet management will embed best-practice-led use of these tools across the lifecycle of such applications – from creation through to remediation and decommissioning into formal, IT-supported applications – encompassing spreadsheets and their unique data flows. It will also offer banks an in-depth understanding of their data landscape. This will help institute data controls and spreadsheet change management processes so that there is complete transparency and an audit trail, tangibly reducing the operational, financial and regulatory risks caused by spreadsheet error.

 

By Henry Umney, CEO, ClusterSeven

 

About the author

Henry Umney is CEO of ClusterSeven. He joined the company in 2006 and for over 10 years was responsible for the commercial operations of ClusterSeven, overseeing globally all Sales and Client activity as well as Partner engagements. In July 2017, he was appointed CEO and is strongly positioned to take the business forward. 


Accenture to enhance core banking platform with SEC Servizi acquisition

Accenture has completed its acquisition of Italian banking technology service provider SEC Servizi SpA from the Intesa Sanpaolo Group. Accenture now has 80.8% ownership of SEC Servizi and will also be acquiring the remaining interests held by other shareholders.

Established in 1972, SEC Servizi is a consortium formed by Italian banks to provide IT services and outsourcing solutions for banks and other financial institutions in the country. Its offerings include application and facility management, centralized back-office services and specialized multi-channel, consulting, education and support solutions. The company reportedly manages more than 21 million transactions per day for nearly 1,400 bank branches in Italy and had revenues of EUR 152 million at the end of 2017. Its clients include Banca di Credito Popolare, Banca Italo Romena, Banca Nuova, Veneto Banca and Allianz Bank Financial Advisors, among others. Intesa Sanpaolo acquired SEC Servizi in 2017 as part of the acquisition of certain assets, liabilities and legal relationships of Banca Popolare di Vicenza S.p.A. and Veneto Banca S.p.A., both in compulsory administrative liquidation.

The acquisition of SEC Servizi’s expertise and technology and operational assets will enable Accenture to create an advanced and innovative core banking platform that can support banks in their transition to digital. This transaction will help to establish Accenture as a leader in the banking technology market in Italy, serving SEC Servizi Spa’s existing customers, including Intesa Sanpaolo and other mid-sized financial institutions in Italy.

After slowly recovering from the financial crisis, Italian banks are now looking at modernizing their technology infrastructure and are increasingly relying on digital resources to remain competitive in the market and align their services to the digitally savvy customer. An indication of this is the drop in the number of branches: at the end of 2017 there were 20 per cent fewer than in 2008. Banks such as Unicredit, Intesa Sanpaolo, Monte dei Paschi, Mediobanca and Banca Carige are leading the way with digitalization initiatives ranging from contactless payments and virtual reality branches to robo-advisory services.

For Accenture, this presents an opportune time to enhance its core banking technology services with the acquisition of SEC Servizi Spa.


A GDPR storm is coming – are you prepared?

Julian Saunders, CEO and founder, PORT.im

Julian Saunders, CEO and founder of personal data governance company PORT.im, discusses how alleged breaches of GDPR by Facebook and Twitter may just be the beginning

Cast your mind back to early 2018. The world was alive with the sound of GDPR commentary. In the run-up to the May compliance deadline, everything was up for debate. Would it spell the end of marketing as we know it? Was anyone actually compliant? Was it good news or bad news for businesses? And, getting the most airtime – would GDPR be a damp squib like the Cookie Directive?

If you were of the opinion GDPR was a lot of hot air, the intervening months may feel like vindication. GDPR has largely gone off the agenda of most media publications and with it the minds of many business owners. However, we’re merely in the eye of the storm. In the last few weeks Facebook, and now Twitter, have been squarely in the crosshairs of regulators for allegedly failing to comply with GDPR. The EU has issued a stark warning that big fines will be handed down before the end of the year. Similarly, the ICO has ramped up its warnings that major action is likely to be taken. Added to this momentum has been a seemingly endless series of high-profile data breaches with Google+ the latest casualty.

For business owners who put their GDPR compliance on the backburner since May, the warnings could not be clearer: If you aren’t GDPR compliant you’re likely to be in some serious trouble in the next few months.

Facebook has quickly become the poster boy for poor data governance procedures. Cambridge Analytica, data breaches, and GDPR failures have all come in quick succession and provide a case study for businesses on how not to collect and manage data. While it may be tempting to revel in some schadenfreude, a better approach is to see what every business can learn from Facebook and how they can protect themselves from the expected GDPR storm.

First, it should go without saying that financial organisations hold some of the most sensitive personal data. Thankfully, the most important data linked to account information has largely been well protected. However, having high security standards around bank accounts can breed complacency especially when you consider it’s not the only information the average financial company holds. The marketing, customer service and sales departments will all, usually, have their own customer databases which may be subject to vastly different security and governance standards. A breach related to any of this data could be fatal to a financial organisation and result in hefty GDPR fines.

General complacency is kryptonite for data management and protection. For Facebook, its complacency manifested itself in lax standards, questionable practices and a belief it would never be brought to account. For financial organisations, it can lead to blind spots related to data that is deemed less ‘sensitive’. Often, to enable smooth marketing, client management and sales operations, customer data is more readily accessible than financial information, shared with more parties, updated more frequently and inputted into more platforms. Each of these processes increases risk. Compounding this issue is a general lack of education related to the power of this data to do harm. Many would ask, what use is an email address to a hacker? The short answer is, a lot. This is why GDPR seeks to protect every piece of personal data.

If you’ve got to this point in this article and you’re beginning to feel some doubt surrounding your data practices – good. Now is the perfect time to audit and review all your data processes and security standards. The baseline should be – is everything GDPR compliant? If it was in May – is it still compliant? New technology, teams and initiatives can all impact your data processes and result in non-compliance.

If you avoided all of this in the faint hope that GDPR wasn’t going to be an issue, you need to get on it immediately. In this instance, buying in technology and availing yourself of the services of specialist consultants will be the fastest (but not the cheapest) option.

Next, what is the general understanding of your staff? All the procedures and technological safeguards will mean nothing if your colleagues do not understand what GDPR is and the danger of data breaches. Undertaking company-wide training regularly and incorporating data management expertise and ethics into staff development and assessment can be a powerful way to measure and improve education.

Finally, if the worst happens and there’s a breach – are you prepared? Time and again we see that a poorly handled response to a data breach does more damage than the breach itself. Again, I’ll point to Facebook and its slow, incomplete and unsatisfactory responses to each and every data issue it has encountered.

Slow responses are symptomatic of a failure to have the right procedures in place. This can be because there is no technology or expertise available to identify the breach in the first instance or the right people are not empowered to make quick decisions. You need to start from the position that any breach, no matter how minor it appears, is serious. It should be reported to a specialist team led by the CEO. Within that team should be the IT lead, marketing, customer service and legal. Consumers should be informed as quickly as possible, both to be GDPR compliant, and to reassure. The business needs to identify who is impacted, how, what went wrong, how it can be fixed and how consumers will be protected in the future. The faster these boxes are ticked and communicated the better the end result – especially if the ICO gets involved. As with anything, practice makes perfect. Conducting wargames and drawing up ideal responses and contingencies with this team could make all the difference.

We now live in a world where the reputation and future of a company can be destroyed by hacks and data breaches. Organisations are generally to blame for this environment. There has long been a culture that personal data is a commodity that businesses can deal with as they wish. Now the wheel has turned. If you’re one of the many business owners that still believe that data governance is just something for the IT department to worry about – you’re going to be in for a big surprise. By the end of the year, a number of large businesses will be hit with near-fatal fines as a warning to other companies. Acting now will ensure that your company is not one of these cautionary tales.


Assuring good customer outcomes in a digital world – the five key risks of digital

James Nethercott, Group Head of Marketing at Regulatory Finance Solutions

Online banking is fast becoming the norm and brings with it many benefits. However, it is not without risk. How can firms ensure that customers are being best served by these new ways of transacting?

Digital banking has many benefits. Customers can instantly manage their finances from any location using an ‘always-on’ service. Firms can scale, gain reach, save costs, capture data more easily and build loyalty. However, digital is not without risk. The same risks of mis-selling, poor servicing and inadequate complaint management are still present, albeit in different ways. Control frameworks need to be in tune with these new ways of interacting with customers. The FCA is clear that good customer outcomes should always result, regardless of the channel.

Research from Forrester indicates that rather than undergo a re-design, many products have simply been migrated online. Products designed for sale in branch or by telephone may not be suited to online. Digital demands an alternative way of thinking. When using an electronic interface, customers behave differently than when talking to an adviser. Natural cognitive biases go unchecked and people may be prone to making rushed and less optimal decisions.

Digital readiness demands more

On the provider side, digital typically tends toward a preoccupation with optimising conversion rates, with less attention given to end-to-end service design and compliance. Digital readiness means more than having high-performing front-end interfaces; it also demands the right back-end processes, policies and controls. Without these, good customer outcomes can easily be compromised.

As with most sectors, omnichannel experiences are standard. Customers will switch from one channel to another throughout their journey and firms need to ensure continuity. Typically, this demands good CRM processes so that customers are treated consistently and appropriately at all touchpoints.

Given the risks inherent in digital, a thorough testing programme is recommended. This provides assurance that each channel is working and, where it is not, gives the insight needed to put things right.

The five key risks of digital

Risks in digital may manifest in different ways to other channels. Here are the most critical areas where good customer outcomes need to be assured.

  1. Buying the right product

Without an advisor to carry out a thorough needs assessment, and then recommend products, customers may select products that are not best suited. Online journeys need to guide customers through a process that is easy to follow and provides them with a good match to their needs and circumstances.

  2. Disclosure

Effective disclosure is particularly problematic in digital journeys. Customers may overlook important information and be prone to over-confidence in financial decision making. It is important that digital journeys provide clear, unambiguous and impartial information. Firms need to be sure that customers fully understand the risks, and this understanding needs to be explicit and tested.

  3. Decision making

The data that customers provide needs to be adequate, appropriate and verified. In addition, the decision-making processes used need to be made clear. This is so customers understand how their information is being used and the terms by which they have been approved, or denied, at any stage.

  4. Product servicing

During the life of the product, service must be effective. Documentation, account servicing, complaints, cancellations and renewals all need to be readily available and compliant. There also needs to be integration with other channels, so where need be, customers can rely on human advice to help them achieve good outcomes.

  5. Vulnerable customers

Firms need to ensure that vulnerable customers are supported and neither disadvantaged nor marginalised by digital. Some are unable to access online services, or to use them effectively. The same levels of service must be available offline, either for the whole or part of the customer journey. In addition, firms need to consider how vulnerability is identified in an online environment and then provide appropriate treatment to ensure good outcomes.

Technology vs. humans in a digital world

The industry is already speculating on how technology can be used to improve compliance. The first steps are simply to optimise existing sources of data so that they can be used for analysing compliance performance. More sophisticated approaches, such as applying voice recognition and semantic technology, will only be a matter of time. However, humans are far from redundant in this.

Humans can spot patterns and anomalies in ways that have not yet been coded, and humans are also capable of moral and ethical judgements that machines are not. Machines also need to be taught, calibrated and checked, a task that needs ‘real’ input and intervention.

FCA concerns over robo-advice show that we may have gone too far in handing all parts of a process to machines. Instead, a balance is needed that incorporates the best of technology and the best of people.

For the time being, at least, people still have a place in ensuring good customer outcomes.

By James Nethercott, Group Head of Marketing at Regulatory Finance Solutions


Visa outage highlights IT maintenance challenges – and the promise of predictability

Evan Kenty, Managing Director EMEA, Park Place Technologies

In June, Visa started rejecting one in 10 financial transactions across the U.K. and Europe – a problem lasting 10 hours and affecting 1.7 million cardholders. Even in an IT environment designed to support 24,000 transactions per second, a hardware failure crashed the system. The incident was a wake-up call for an industry reluctant to suspend services for scheduled, expensive repairs. Could predictive maintenance have prevented the crisis?

Predictive maintenance draws on machine learning, neural networking, and artificial intelligence. Commonly used in marketing, learning technologies improve with use: every time you search Google, its accuracy improves.

Yet while AI can predict preference, it is still learning how to factor in context. Nirvana for marketers will be when technology shows my car purchase is followed by a caffeine urge, with my coffee advertised accordingly. It’s the search for the unforeseeable yet real relationship that can only be found with a deep data dive. We’re not there yet, but we’re on the way.

Maintenance that informs itself

The same neural networking technologies are being applied to hardware and networks. A data centre generates vast amounts of data, and just as marketers want to utilise all the information available, so do data centre managers. The promise of machine learning is the ability to examine the full range of performance data in real time to detect patterns indicative of “faults-in-the-making”, uncovering relationships no human engineer would spot, like cars and caffeine.

This application of AI algorithms to data centre maintenance underpins our ParkView advanced monitoring system, which contextualises patterns to “understand” infrastructure behaviours. This means instant fault identification and fewer false alarms. Future predictive systems will prevent the types of issues Visa experienced.
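As a rough illustration of the underlying idea – not ParkView itself – the sketch below trains scikit-learn’s IsolationForest on baseline server telemetry and flags new readings that deviate from the learned pattern: the kind of ‘fault-in-the-making’ signal a monitoring system would escalate. The feature names and figures are hypothetical.

```python
import numpy as np
from sklearn.ensemble import IsolationForest  # pip install scikit-learn

rng = np.random.default_rng(0)

# Hypothetical baseline telemetry: [fan_rpm, disk_temp_C, correctable_ecc_errors_per_hr]
baseline = np.column_stack([
    rng.normal(9000, 300, 5000),
    rng.normal(38, 2, 5000),
    rng.poisson(1, 5000),
])

model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

# New readings: the second shows a slowing fan, rising temperature and ECC errors
new_readings = np.array([
    [9100, 37.5, 0],
    [6200, 55.0, 40],   # pattern consistent with an impending fan/drive failure
])
flags = model.predict(new_readings)  # +1 = normal, -1 = anomalous
for reading, flag in zip(new_readings, flags):
    print(reading, "ALERT: investigate" if flag == -1 else "ok")
```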

The next stage: predictive maintenance taps IoT

In the Tom Cruise sci-fi movie, Minority Report, police use “psychic technology” to prevent crimes before they happen. The twist comes when the crime-solver is accused of the future murder of a man he hasn’t yet met.

There is a parallel with data centres. Human error causes an estimated 75 percent of downtime. That’s why data centres are less populated. The perimeter has security staff, but the interiors are becoming vast and lonely server expanses, where the electric hum is rarely broken by the sound of footsteps. The downside is the lack of human detection of things like temperature changes and dripping water.

That’s where the IoT and the Industry 4.0 playbook developed in heavy industry come in: remote monitoring enables smart, predictive maintenance. A good example here is fixing a data centre air-conditioning system based on its predicted performance in relation to its surrounding environment. This concept can be applied across the entirety of a data centre and its cooling, power, networking, compute, storage and other equipment. Emerging dynamic and largely automated predictive maintenance management will transform the data centres we know today into self-monitoring, self-healing technology hubs, enabling reliability as we move computing to the edge to support the IoT applications of tomorrow.

Evidence indicates a move from a reactive/corrective stance, still dominant in many data centres, to more preventative maintenance delivering average savings of up to 18%. The next leap towards predictive maintenance drops spending about 12% further. In fact, Google used such strategies to drive a 15% drop in overall energy overhead.

Combating downtime with predictive technology

Enterprises must integrate predictive maintenance. Downtime kills reputations, profits, and customer relationships. Most organisations like Visa can recover from unplanned outages, but reducing unscheduled maintenance is always preferable.

IT leaders must make hardware and facilities as downtime-proof as possible. This means using machine learning and AI to return a pound of ROI on every ounce of prevention possible. Banks are investing in AI for a range of purposes, from contract scanning to fighting fraud. It’s essential that the new technology is used to fix problems in advance.

By Evan Kenty, Managing Director EMEA, Park Place Technologies


Sit tight, modern APIs will soon take banks on a fast ride  

Hans Tesselaar, BIAN

The world of banking today is like a race car on the grid preparing for the inevitable green light. There is a lot of noise before the ‘go’ signal; from the vehicles revving their engines, pundits in commentary boxes speculating on the race outcome, and spectators cheering on favourites from the grandstands. When the chequered flag drops and the race begins, a plume of dust and smoke is left behind as the vehicles speed off across the track. The winner is yet to be decided… 

In banking, the race is just starting. Amidst the noise, speculation and fanfare, success in this industry will come down to one key thing: open APIs. Those that can harness them correctly will take the top spot on the podium. 

Shifting up a gear 

Modernisation in retail banking is largely being driven by customers, who have come to expect a level of digitalisation consistent with what they experience in other areas of their lives. Simply compare well-known consumer tech innovations such as the Amazon Echo, or Google’s impressive AI-enabled search function, to understand why people expect more from those who handle their money.  

This is not to say banks have neglected innovation. Flashier, more convenient services for customers have been introduced. But in the face of ongoing political, legacy, technological, competitive and regulatory challenges, the ‘from scratch’ development of advanced Google or Amazon-style services remains an uphill struggle.  

Even in light of the recent technological advancements permitted by open banking, the issues outlined above have prevented many banks from properly grasping the opportunities of technology and the disintermediation of data.  

Opening the throttle 

Open banking is accelerating the banking industry into the future, with APIs acting as the fuel to power the innovation ahead. But successful development and implementation of API-based technology is a long-winded and costly task for banks to undertake alone. To combat this, some banks have started acquiring fintech businesses to quickly bolster their own service offerings. However, for maximum benefit, industry-wide collaboration around innovation is needed. 

This will require banks to shift from a historically closed-off, competitive mentality, to recognising the advantages of pooling knowledge and raising standards of industry innovation together. BIAN, the organisation that I am proud to head up, has spent a decade promoting this ideology. Our global organisation brings together some of the biggest, most innovative banks and technology vendors, to build a common IT architecture or ‘how-to guide’ to streamline the inevitable move to modern, high-quality, and customer oriented services. 

A large part of how to create a modern IT architecture for banks involves utilising a library of definitions for popular APIs, to avoid unnecessary duplication of time, money and effort. BIAN’s current banking architecture contains 26 new API definitions, including ones that instruct banks how to build automated customer on-boarding processes. These API definitions comply with the SWIFT ISO20022 open banking standardisation approach, making them universally compatible. 
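By way of illustration only, the sketch below shows the rough shape such a definition might take once implemented – a hypothetical REST endpoint for initiating customer onboarding, written with Python’s FastAPI. The path, field names and statuses are invented for the example; they are not BIAN’s published API definitions or ISO 20022 message schemas.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class OnboardingRequest(BaseModel):
    # Hypothetical fields; a real definition would follow the published standard
    legal_name: str
    date_of_birth: str
    identity_document_ref: str
    requested_products: list[str]

class OnboardingResponse(BaseModel):
    onboarding_id: str
    status: str  # e.g. "pending-kyc", "approved"

@app.post("/customer-onboarding/initiate", response_model=OnboardingResponse)
def initiate_onboarding(req: OnboardingRequest) -> OnboardingResponse:
    # In a real bank this step would trigger KYC/AML checks and core-banking account set-up
    return OnboardingResponse(onboarding_id="cob-0001", status="pending-kyc")
```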

Miles ahead 

Adopting a common IT framework would allow the banking industry to launch services faster, and better meet customer demands for smarter and more transparent services. As time goes on, more complex API functionalities will be built, allowing banks to not just incorporate more exciting services into their offering (e.g. WhatsApp payment), but also establish novel ways to maximise new and previously untapped revenue streams. Naturally, modern and streamlined services can reduce operational costs by eliminating outdated back and middle-office processes.  

Looking ahead, the next phase of API development will focus on ‘microservices’ – that is, API-first banking capabilities which run independently of core banking systems. Microservices will allow banks to take a “pick-and-mix” approach to their offerings, aligning them more closely to their customer base. In time, such a model could renew the core banking system and change the banking IT function forever.

First place 

The introduction of a common IT framework will be of massive benefit to the banking industry, helping major players to address customers’ demands for modern banking solutions in a more effective manner. As the introduction of higher standards for global banking services grows, the industry will eventually move away from competing on service offerings to competing on brand value. Like we have seen in the retail industry, the winners in banking will be those that provide the right mix of innovative offerings as well as premium customer service.  

By Hans Tesselaar, Executive Director at BIAN 


E-invoicing: How digital networks are helping to eradicate decade old processes

Chris Rauen, Senior Manager, Solutions Marketing at SAP Ariba

If you have an electronic invoice system that just about meets the needs of the accounts team, but operates in complete isolation from the rest of the company, is that a system that provides much value?

It might do — if you’re doing business in the 1990s. Since then, a plethora of electronic invoicing systems have entered a crowded marketplace, all looking to streamline the complex way of processing invoices globally.

In today’s digital economy, new business value comes from linking invoice data to contracts, purchase orders, service entry sheets and goods receipts for automated matching. Furthermore, automation of the invoice management process must extend beyond enterprise operations to include suppliers. Yet few platforms enable this. By treating accounts payable as an isolated department, many e-invoicing systems fall short of their potential.

So, how can linking electronic invoicing with a company’s other operational systems, and to suppliers, unlock this value? It turns out that an interconnected approach to invoice management in a digital age reduces costly errors, strengthens compliance, and facilitates collaboration both within the organisation and among trading partners.

A cloud-based network can assess trading partners against hundreds of criteria, ranging from whether they can root out forced labour from their supply chain to how well they document the use of natural resources, and even whether they give work to minority suppliers. Of course, while software alone cannot ensure compliance with the ever-changing policies that continue to come into effect, it remains a powerful tool in the effort to achieve it. Compliance, once a tedious task, can now be managed from a dashboard.

To reduce invoice errors effectively, a digital network must rely on intelligence – not just the human kind, but also the smart invoicing rules that are essential to a business network. These rules validate invoices before they are posted for payment, streamlining processing, reducing operating costs, lowering overpayment and fraud risk, and maximising opportunities for early-payment discounts.
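To make the idea of smart invoicing rules concrete, here is a minimal, hypothetical sketch of the kind of pre-posting validation such a network applies – a simple three-way match of invoice lines against the purchase order and the goods receipt, with a price tolerance. Real networks apply hundreds of configurable rules; the field names and threshold here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Line:
    item: str
    qty: int
    unit_price: float

PRICE_TOLERANCE = 0.02  # allow 2% price variance before flagging (illustrative threshold)

def validate_invoice(invoice, purchase_order, goods_receipt):
    """Return a list of exceptions; an empty list means the invoice can post for payment."""
    po = {l.item: l for l in purchase_order}
    gr = {l.item: l for l in goods_receipt}
    exceptions = []
    for line in invoice:
        if line.item not in po:
            exceptions.append(f"{line.item}: not on purchase order")
            continue
        if line.qty > gr.get(line.item, Line(line.item, 0, 0)).qty:
            exceptions.append(f"{line.item}: billed qty {line.qty} exceeds goods received")
        if abs(line.unit_price - po[line.item].unit_price) > PRICE_TOLERANCE * po[line.item].unit_price:
            exceptions.append(f"{line.item}: price {line.unit_price} outside tolerance")
    return exceptions

if __name__ == "__main__":
    po = [Line("laptop", 10, 900.0)]
    gr = [Line("laptop", 8, 900.0)]
    inv = [Line("laptop", 10, 950.0)]
    print(validate_invoice(inv, po, gr))  # flags over-billing vs receipt and a price variance
```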

By enabling real-time collaboration between buyers and suppliers, digital networks not only bridge the information gap that can delay invoice processing, but they also reduce the complexity often associated with compliance. That includes effectively screening suppliers and monitoring business policies automatically before a transaction takes place.

However, perhaps the greatest advantage of digital networks is collaboration. Issuing an invoice, even when accurate and on time, can sometimes be a one-way, asynchronous conversation. A buyer receives an agreed-upon product or service from a supplier, who at a later date sends out an invoice and, at an even later date, receives payment. This scenario has been the same for decades. But digital networks challenge that. The immediacy of network communications begs the question: should electronic invoicing merely replicate the age-old process that postal mail once facilitated? Or should it improve upon it?

We continue to see chief procurement officers choosing the latter. Through their day-to-day experience with digital networks, they have come to view invoice processing as just one part of the wider exchange of information among trading partners. An electronic invoice reflects a snapshot of the multi-party collaboration that networks enable, and — through intelligent business rules — alerts of potential errors or exceptions relating to the transaction. As we move forward in the digital age, and buyers and suppliers extend their relationship to include product design, innovation and product delivery, they are able to expand the scope of electronic invoicing to capture up-to-the-minute progress reports on the teamwork within and across organisations.

Ultimately, your electronic invoicing system shouldn’t focus only on accounts payable, it should give open visibility onto the rest of your operations and even who you do business with – so that mutual growth can be achieved and positive collaboration can flourish.

The author is Chris Rauen, Senior Manager, Solutions Marketing at SAP Ariba, the company behind the world’s largest business network, linking together buyers and suppliers from more than 3.4 million companies in 190 countries


The Death of the PIN

David Orme, SVP, IDEX

Personal identification numbers (PINs) are everywhere. These numeric versions of the password have been at the heart of data security for decades, but time moves on and it is becoming evident that the PIN is no longer fit for purpose. It is too insecure and is leaving consumers exposed to fraud. 

Why bin the PIN?

In a world that is increasingly reliant on technology to complete even the most security-sensitive tasks, PIN usage is ludicrously insecure. People do silly things with their PINs; they write them down (often on the back of the very card they are supposed to protect), share them and use predictable number combinations (such as birth or wedding dates) that can easily be discovered via social media or other means. And this is entirely understandable: PINs must be both memorable and obscure, unforgettable to the owner but difficult for others to work out. This puts PIN users — all of us, basically — between the proverbial rock and a hard place.

Previous research has shown that when people were asked about their bank card usage, more than half (53%) shared their PIN with another person, 34% of those who used a PIN for more than one application used the same PIN for all of them, and more than a third (34%) of respondents used their banking PIN for unrelated purposes, such as voicemail codes and internet passwords, as well. In the same study, not only survey respondents but also leaked and aggregated PIN data from other sources revealed that the use of dates as PINs is astonishingly common [1].

But if the PIN has had its day, what are we going to replace it with?

Biometrics

Biometrics may seem to be the obvious response to this problem: fingerprint sensors, iris recognition and voice recognition have all been rolled out in various contexts, including financial services, over the past decade or so and have worked extremely well. In fact, wherever security is absolutely crucial, you are almost certain to find a biometric sensor — passports, government ID and telephone banking are all applications in which biometric authentication has proven highly successful.

However, PINs are used to authenticate any credit or debit card transaction, and therein lies the problem. For biometric authentication to work, there has to be a correct (reference) version of the voice, iris or fingerprint stored, and this requires a sensor.

It is one thing to build a sensor into a smartphone or door lock, but quite another to attach it to a flexible plastic payment card. Add to that the fact that cards are routinely left in handbags or pockets and used day in and day out, and it becomes clear why the search for a flexible, lightweight, but resilient, fingerprint sensor that is also straightforward enough for the general public to use, has been the holy grail of payment card security for quite some time.

Another key advantage of fingerprint sensors for payment cards is that the security data is much less easy to hack, particularly from remote locations, than is the case with PINs. Not only are fingerprints very difficult to forge, once registered they are only recorded on the card and not kept in a central data repository in the way that PINs often are – making them inaccessible to anyone who is not physically present with the card. In short, they cannot be ‘hacked’.

Your newly flexible friend

Fortunately, the long-held ambition to add biometrics to cashless transactions has now been achieved, with the production and trials of an extremely thin, flexible and durable fingerprint sensor suitable for use with payment cards. The level of technology that has been developed behind the sensor makes it very straightforward for the user to record their fingerprint; the reference fingerprint can easily be uploaded to the card by the user, at home, and once that is done they can use the card over existing secure payment infrastructures – including both chip-and-PIN and contactless card readers – in the usual way.

Once it is registered and in use, the resolution of the sensor and the quality of image handling is so great that it can recognise prints from wet or dry fingers and knows the difference between the fingerprint and image ‘noise’ (smears, smudging etc.) that is often found alongside fingerprints. The result is a very flexible, durable sensor that provides fast and accurate authentication.

The PIN is dead, long live the sensor

Trials of payment cards using fingerprint sensor technology are now complete or underway in multiple markets, including Bulgaria, the US, Mexico, Cyprus, Japan, the Middle East and South Africa. Financial giants including Visa and Mastercard have already expressed their commitment to biometric cards with fingerprint sensors, and some are set to begin roll-out from the latter half of 2018. Mastercard, in particular, has specified remote enrolment as a ‘must have’ on its biometric cards, not only for user convenience but also as a means to ensure that biometrics replace the PIN swiftly, easily and in large volumes [2].

And so, with the biometric card revolution now well underway, it is time to say farewell to the PIN (if customers can still remember it, that is) and look forward to an upsurge in biometric payment card adoption in the very near future. Our financial futures, it seems, are at our fingertips.

 

By Dave Orme, SVP, IDEX Biometrics

 

References

[1] Bonneau J, Preibusch S and Anderson R. A birthday present every eleven wallets? The security of customer-chosen banking PINs: https://www.cl.cam.ac.uk/~rja14/Papers/BPA12-FC-banking_pin_security.pdf

[2] Mastercard announces remote enrolment on biometric credit cards: https://mobileidworld.com/mastercard-remote-enrollment-biometric-credit-cards-905021/

 


BofE rate rise: the unintended trading cost consequences for banks

Kerril Burke, CEO of Meritsoft

Does anyone long for a return to more benign economic times? A time when a rise in the base rate simply led to immediate benefits for savers. Well, get prepared for a continued long wait, as last week’s decision from the Bank of England (BofE) signals anything but a move to more conventional times.

In fact, this rise, albeit small, has much wider knock-on effects than simply “what does this mean for my mortgage repayments”? Similarly, it obviously increases the costs for anyone trading the capital markets in terms of funding. Even with interest rates at historically low levels, some of the biggest players have been losing double digit millions in unrecovered failed funding costs. And with more hikes down the road, there are further implications of the BofE rate increase for the cost of trading.

As of last Thursday, the cost of funding failed trades in Sterling shot up 50%. Therefore, any trader looking to borrow, say, one million to finance a trade now faces an extra 0.25% per annum in funding costs. One of the main strategies traders use to minimise funding is buying and selling for the same contractual settlement date – paying for a purchase out of the proceeds received from a sale. Take the example of a trader selling Sainsbury’s stock in order to fund a purchase of Tesco shares, both for the same agreed settlement date. The trader expects the cash from the Sainsbury’s trade in order to settle the Tesco transaction. There is just one small issue – he hasn’t received the money for his stake in Sainsbury’s. In this, let’s face it, not untypical scenario, the only way to pay for the Tesco shares is to borrow the money. The trader in question now has to take on an additional funding cost to borrow the funds to settle the Tesco trade. If the fail in the Sainsbury’s shares was due to the counterparty, it does not seem fair that the trader is forced to bear this additional cost, does it?

Market sentiment

But hey, perhaps it doesn’t cost much? The cost will obviously vary based on the amount of cash open and the length of time it is outstanding, but it could run into thousands of dollars per trade. And the major trading firms can have thousands of securities, FX, equity and commodity derivatives fails every day. This may have been hidden because rates have been, and largely still are, at record lows. But the trend and market sentiment are now unmistakably upwards. However, this is only part of the problem.

There are costs, and capital implications, for market participants in the wide range of receivables on their balance sheets. These balances, at least the ones in Sterling, are now a quarter of a percent more expensive to fund. So the cost of failing to settle these transactions is now far more than it would have been before the hike. A bank is now at a distinct disadvantage, particularly if it does not have a way to identify, optimise and recover where it is incurring funding and capital costs through no fault of its own. Essentially, by having receivable items open while waiting for money to come in, it will be borrowing cash to cover itself. If a trade fails to settle for, say, five days, then that is a whole week of extra funding costs that a bank needs to cough up. And not being able to track additional funding costs due to late settlements is not the only issue. Many banks are still not even identifying the direct cost impact of a trade actually failing. If a bank can’t work out the cost implications of not receiving funds when a trade fails, then how on earth can it identify whether or not it can claim money back from its counterparties?
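For a rough sense of the arithmetic, the sketch below estimates the extra funding cost of a failed settlement from the open cash amount, the funding rate and the number of days the fail is outstanding, before and after a 25 basis point rise. The notional, rates and day-count convention are illustrative assumptions only; for larger notionals and longer fails the cost quickly runs into the thousands, as noted above.

```python
def fail_funding_cost(notional, annual_rate, days_outstanding, day_count=365):
    """Simple-interest cost of borrowing `notional` while a settlement fail is open."""
    return notional * annual_rate * days_outstanding / day_count

notional = 1_000_000          # GBP owed on the unsettled sale (illustrative)
days = 5                      # trade fails to settle for five business days

before = fail_funding_cost(notional, 0.0050, days)   # 0.50% funding rate
after = fail_funding_cost(notional, 0.0075, days)    # 0.75% after the 25bp hike

print(f"Cost before hike: £{before:,.2f}")           # ~£68.49
print(f"Cost after hike:  £{after:,.2f}")            # ~£102.74, a 50% increase
print(f"Extra cost:       £{after - before:,.2f}")
```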

Trying to work out the many effects of the BofE’s latest monetary policy decision is difficult, but like those with a variable mortgage, trading desks are impacted. Late settlement means more funding, and higher rates mean that additional funding costs more. Preparing now to handle the trading cost impact of this small rise, and of the upwards trend, is exactly what’s needed to ensure banks are ahead of the curve whenever the BofE or other central banks decide to hike rates again in the future.

By Kerril Burke, CEO of Meritsoft

 
