
Under the hood with Gary Collyer: how TradeSpeed builds its document examination checks

How one of trade finance's most experienced practitioners turned decades of document examination expertise into the knowledge and logic behind TradeSpeed's automated checks.


By Ezequiel Djeredjian




Gary Collyer's name comes up in most conversations about letters of credit. As the former chair of the UCP 600 drafting group, Senior Technical Advisor to the ICC, and a veteran of LC departments at HSBC, Citibank, and ABN AMRO, he has spent a career shaping the rules that govern trade finance globally. Four years ago, he joined Complidata with one goal: translate a lifetime of document examination expertise into system logic for TradeSpeed, Complidata's AI-powered trade finance platform.


We sat down with Gary to understand how that translation works, what it takes to build checks that a bank can trust, and how his understanding of what TradeSpeed can and should do has evolved over time.



Ezequiel: Thank you for your time, Gary. I'm really looking forward to this conversation. Before we get into the nuts and bolts of document examination and AI, let's start with a short intro. Your name is well known across the trade finance sector, but for anyone who might not be familiar with your story, can you share your experience and involvement in trade finance, and in particular in the development of the frameworks and rules currently used in the industry?

Gary: Sure. I started my career with what was Midland Bank, which was later acquired by HSBC. When I started in the letter of credit department in London, there were 265 people dealing with LCs, something you will rarely see today in a bank in London. From there I progressed to manager of the LC department for HSBC in the UK, then moved on to Citibank as a global product head with responsibility for all trade finance products and front- and back-end systems. Then the same for ABN AMRO. After that, I set up my own consultancy company.

In amongst all that, I became involved with the International Chamber of Commerce as the UK representative. I was then appointed as the ICC's Senior Technical Advisor. I was the chair for the UCP 600 drafting group, the chair and co-chair for the ISBP revisions, and co-chair for the eUCP, the electronic supplement to the UCP. So my career has spanned bank, corporate, and international levels.

Four years ago I was asked by Complidata if I would be willing to act as a consultant. They'd been approached by one of their client banks to develop a new product in trade finance and they recommended, quite vehemently if I might say, to bring me in as part of the team. So I completed a number of interviews and discussions with the Complidata team, and eventually they convinced me to join.


Ezequiel: When you joined Complidata to help build TradeSpeed's checks, what did you expect the work to look like, and what surprised you?

Gary: I wasn't too sure about it at the time. Four years ago AI wasn't on everybody's lips. It was a complete unknown. I come from an environment where we had talked about electronifying documents, but no one had actually broached the subject of having a machine check the documents. At the beginning it was more of, "Is this for real? Is this something that's possible?" It just didn't ring true as something that could really be done.

When I became involved, Complidata at the time was more of a financial crime company. They'd developed checks for screening and AML red flags. It was from that that their bank client said, "Look, if you want to move into another area, the area will be document examination under LCs, and under collections, guarantees, etc."

So my first thought was, "Right, let's see what's under the hood." They sent me loads of details to review, I looked through it, I thought, "Yeah, it sounds alright, but you can put anything into print. Let's see it in operation."

From those first interactions, I was surprised at the level and degree of checks that were already in place for screening and AML red flags. But document examination was a whole new ball game. We were almost starting from a blank sheet of paper. What you would classify as being a fairly simple check could end up being 20 checks to cover all the areas you almost take for granted when you do something in your head. This isn't a simple five-minute job.

Within about two months of joining, I got this feeling that this could work. This could really work. But there was a lot of work.


Ezequiel: How has your view developed from when you came in, to where you are now with TradeSpeed?

Gary: I see my outlook and expectations changing almost every couple of months haha... That's the way AI is moving. Something that I thought a couple of months ago was not going to be possible for another six or twelve months is now possible. That's the reality in which we live.

When we started off it was a pretty slow process. We required at least 200 examples of documents in order to train the system: what a bill of lading looks like, what invoices look like, certificates of origin. At least 200. Then within a year that had gone down to 50. Then it came down to 20, then to 10, then to 5, and now we're at zero.

I think that because we are one of the younger companies in the space, we've been able to take advantage of the more recent developments in AI and continue to take those developments on board, because we're not tied to previous investments where competitors have invested their time and energy in creating their solution.

I think we're reaping the rewards of that now, when I see what we can do and what we will be doing over the coming months with a whole new paradigm of how we create checks.


Ezequiel: Let's get a bit into the weeds now. Walk me through the check creation process. How does a rule or check and the application of it exist inside your head, and how does it end up in the system?

Gary: The starting point is always what you know. You base it upon experience, upon your knowledge of the rules, and so on. When I receive a question today that I've never seen before, I still have to make a decision, because my clients expect me to. Making that decision in my head is, in a way, simple, because I can analyse it all in one go.

Getting that into an automated system is not easy. Because you have to create a check that's going to cover that scenario, but what about if there's a slight twist? What if there are different permutations? A single check, done properly, could be anything from a day to a week in development.

That is why we took the view very early on that we would not do separate checks where there is a common topic. For example, weights on documents. Instead of two separate checks for gross weights and net weights, and so on, we have one check that says, "Are the weights consistent?" Underneath that is a whole raft of checks covering every type and description of weights, across multiple documents. That dramatically increased our ability to turn around checks.
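To make the consolidation Gary describes concrete, a single umbrella check can fan out over every weight field found across the presented documents. The sketch below is illustrative Python only; the type names and grouping logic are our own assumptions, not TradeSpeed's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class WeightEntry:
    document: str     # e.g. "invoice", "bill_of_lading"
    kind: str         # e.g. "gross", "net", "tare"
    value: float
    unit: str         # e.g. "KGS"

def weights_consistent(entries: list[WeightEntry]) -> list[str]:
    """Return discrepancy messages; an empty list means the check is OK."""
    discrepancies = []
    # Group every weight found across the presentation by its kind,
    # then confirm each document agrees with the first one seen.
    by_kind: dict[str, list[WeightEntry]] = {}
    for e in entries:
        by_kind.setdefault(e.kind, []).append(e)
    for kind, group in by_kind.items():
        baseline = group[0]
        for other in group[1:]:
            if (other.value, other.unit) != (baseline.value, baseline.unit):
                discrepancies.append(
                    f"{kind} weight on {other.document} "
                    f"({other.value} {other.unit}) conflicts with "
                    f"{baseline.document} ({baseline.value} {baseline.unit})"
                )
    return discrepancies
```

The design point is that a new weight type or a new document carrying weights slots into the same umbrella check, rather than spawning another top-level check.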

Not only that, but we've looked at how AI is moving, we've kept up to date with it, and we're always looking ahead as to what we want the AI to do that maybe it doesn't do today, so that we're ready when that functionality becomes available. If we know something is coming in the AI that's going to change how a check operates, we are already preparing for it to deliver the best possible version of it when it is time.

We're not just asking, 'What do we have today?' We're also looking to say, 'Is there something on the horizon that's going to change the way in which that check operates?'

“We're not just asking, 'What do we have today?' We're also looking to say, 'Is there something on the horizon that's going to change the way in which that check operates?'”

Ezequiel: Once you've identified what needs checking, what does the handoff to the engineering team actually look like?

Gary: What we do is actually create an on-line document for every single check. That check will have a basic description for the engineers so they understand: How does this work? What do examiners and teams at banks do in a manual world today? What does this mean to them? What is this check, how does it operate, where do I see it, when is it involved? Is it LC-driven, rule-driven, or document-driven?

Number two: which rules in the UCP or practices in the ISBP apply to that particular check? This allows us to educate the AI as to what articles and paragraphs of the rules it needs to look at.

Then the scope: what do we want the check to do? In there we put all the different permutations that can occur. We basically say: what's allowed, what's not allowed, what's a given, what things are static, what things move, how does this change, how does it operate. All those kinds of things are built in, so that when we build the solution, all those factors have been considered for that particular check.

Finally, we provide all the different possible examples. "The LC says this, the document says that -- is this okay or not okay?" We give all the different permutations for that particular check, so that when the engineers come to create and test that check, at the beginning they've got the requirements, and at the end they've got: "This is the outcome you should be getting if these are the scenarios." And that's what they test against.

A lot of time is spent on this process in order to make sure that we cover everything feasible within a letter of credit and within the rules.
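The spec-page workflow Gary describes (requirements at the front, expected outcomes at the back) resembles scenario-driven testing. Here is a hedged sketch with a hypothetical currency check; the field names and structure are illustrative, not TradeSpeed's actual schema.

```python
# One specification page per check: description for the engineers,
# applicable rule references, scope notes, and worked scenarios with
# expected outcomes that the implementation is tested against.
check_spec = {
    "name": "invoice_currency_matches_credit",
    "description": "The invoice must be made out in the currency of the credit.",
    "rules": ["UCP 600 art. 18", "ISBP 745 section C"],  # pointers for the AI
    "scope": "Applies to every invoice in the presentation.",
    "scenarios": [
        # (credit currency, invoice currency, expected outcome)
        ("USD", "USD", "OK"),
        ("USD", "EUR", "NOK"),
    ],
}

def run_check(credit_ccy: str, invoice_ccy: str) -> str:
    """Toy implementation of the hypothetical check above."""
    return "OK" if credit_ccy == invoice_ccy else "NOK"

# Engineers verify the implementation against every documented scenario.
for credit_ccy, invoice_ccy, expected in check_spec["scenarios"]:
    assert run_check(credit_ccy, invoice_ccy) == expected
```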


Ezequiel: I bet that sounds easier than it actually is. So that covers the checks you know you need. But letters of credit can contain conditions nobody has seen before. How do you account for that?

Gary: We can never account for what a bank or an applicant will put into an LC. No one can. Yes, you have a Swift message, which is largely structured, but we have three fields where anything goes: 45A, 46A, and 47A. There are things that appear in these fields that shouldn't be there. They should be elsewhere within that MT 700/710.

So the challenge we face is: how do we build checks for the unknown? Because at the end of the day, we cannot have a situation whereby a presentation is made under a letter of credit where field 47A, for example, includes a condition that no one has seen before and we don't have an explicit check for it. Because if there's no check, there's no examination, and that could be critical for a bank. That could be absolutely bad news and lead to the bank accepting the documents and another bank saying, "No, it's wrong because of this part in field 47A."

So we've spent a lot of time developing prompts to aid the checking of certain fields of the MT 700/710, namely fields 45A, 46A, and 47A. We establish these prompts so that TradeSpeed is looking out for such conditions. If it finds one but we don't have a check for it, TradeSpeed will indicate to the user that they need to investigate.

Nothing drops down this big black hole and disappears. That has been my aim from day one since I joined.

“Nothing drops down this big black hole and disappears. That has been my aim from day one since I joined.”

We cannot have a solution that says, "Yeah, we're 80% sure this is going to work." We have to have a solution that says, "Right, 99% we have got the checks that will cover your presentation. For the other 1%, we have instilled within TradeSpeed certain prompts, so that if something appears for which we haven't yet got a check, it will highlight to the user to go and look at that particular piece of data."

So in 99% of cases, when a presentation is uploaded to TradeSpeed, there will only be two indicators: OK or NOK. But for that 1%, there will be three: OK, NOK, Investigate. That's not a failure of TradeSpeed. It's not a failure of the AI. It's a constant issue that arises when banks put in conditions that they shouldn't normally put in (at all, or in that particular Swift field), or even conditions that they themselves don't understand.

How do you train a system to understand something that maybe the bank doesn't understand or hasn't used before? The way we do it is via these prompts, and that allows us to manage a complete examination of the presentation.
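The three-state outcome can be sketched as a simple routing rule: a condition matched by a known check resolves to OK or NOK, and anything unmatched falls through to Investigate rather than disappearing. The keyword matching below is a placeholder for the prompts Gary describes; all names are illustrative, not TradeSpeed's internals.

```python
from enum import Enum
from typing import Callable

class Outcome(Enum):
    OK = "OK"
    NOK = "NOK"
    INVESTIGATE = "Investigate"

def examine_condition(
    condition_text: str,
    known_checks: dict[str, Callable[[str], bool]],
) -> Outcome:
    """Route a condition from field 45A/46A/47A to a check, or flag it."""
    for keyword, check in known_checks.items():
        if keyword in condition_text.lower():
            return Outcome.OK if check(condition_text) else Outcome.NOK
    # No check covers this condition: nothing drops down the black hole.
    # It surfaces to the human examiner instead.
    return Outcome.INVESTIGATE
```

The essential property is the default branch: absence of a check never silently equals absence of examination.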

We are not just content designing great checks for what we know. That isn't good enough in my book. We have to do the unknown. That, I think, is the true spirit and intent of TradeSpeed.

"We are not just content designing great checks for what we know. That isn't good enough in my book. We have to do the unknown."

Ezequiel: If you were to start with a blank sheet of paper and your document was an invoice, how would you start?

Gary: What we would do is create a document outlining all the criteria for what is necessary for an invoice. This goes back to what I said earlier about creating this page for each check, where we would highlight what checks we would need.

First port of call is the UCP: What does UCP say about the invoice? What are the key pieces of data that need to be in an invoice? Then the ISBP, which has a whole section on invoices. Those two give you quite a lot of the data that has to appear.

Then there's a third category: what do we actually see in invoices? On average, 80% of the data that beneficiaries put on their documents is not required. But because it's there, the banks have to check it. And because it's there, TradeSpeed has to check it.

We have previously listed out what data could appear on each document type. Not what will appear. What could appear. We're picking up the 100%, including the 1%. We ended up with something like 150 data points per document as an average. If it's a possibility that it could be there but you don't normally see it, we still cover it.

From there, we create the programme of the check: what is the check, and what is the discrepancy language that will apply if that check is NOK. I've personally spent a lot of time developing those discrepancy wordings. Then we get into creating the individual on-line pages I mentioned that explain the scope and all the examples. Every single check has one of these pages. That's the degree we go to. It's not high-level, a five-minute call via an on-line chat platform. This is fully documented. When anything changes with the rules, that page is updated, and the team can go back and re-address that particular check.

And there's a realisation that things have to happen across the board. If we create a check for an invoice and that check is relevant to any other document, we complete it for that document at the same time. We don't come back to it later. That's the process we go through; it's about doubling up.


Ezequiel: That is really thorough, thanks. And what about the people at the other side of the check development process? Mathieu, Siebe, or the rest of the tech team do not come with a background in the trade finance industry. How do you manage this knowledge transition and back-and-forth?

Gary: Honestly, I must say that four years ago their knowledge of trade finance was pretty limited. I think the process and depth of trade finance has really been an eye-opener for them. When I see the emojis on some of my messages, I know that this is an "oh my God, what are you telling me?" kind of response from them.

But the reality is, they grasp it. If you explain it to them as to why it's necessary, what you expect the check to do, then there's a simple outcome: can we do it today, can we do it tomorrow, or is it something that's on the horizon? That's the way we operate.

We've taken the view that there's no point in putting out a check that's basically 70% complete. Because then you're saying, "You can use this check for this, but you can't use it for that." And then when something changes, your list of what you can do shifts.

We think it's far better to say: let's work to the solution where we can give a 100% complete check and say, "Right, this check is available for you to use, for every case."

But as for the way we operate between myself and the tech team: we send messages at half eleven at night and seven o'clock in the morning. It shows that people care about what they're doing within the solution, whether it's wanting to learn or finding alternatives to things that come up.

It is worth saying that a few of the tech team could now take up positions in a bank as a document examiner, not that we want them to leave us!


Ezequiel: Can you give an example of a check that's pushing the boundaries of what AI can do right now?

Gary: Signatures. Signatures are a big issue in document examination. It's not just the question of, "Has this document been signed?" Clearly, you look at it and say, "Yeah, there's a signature." But it has other issues. A signature, for example, if it's made in handwriting, is a way of making a document an original, which is another check altogether, a completely different check.

If we're uploading a PDF, how is TradeSpeed expected to know that a signature is handwritten as opposed to a signature added as part of the document creation? Which would be a copy under the UCP. Or the actual document that has been uploaded is actually a photocopy, not an original, and there's no original signature. How would TradeSpeed know?

We've spent a lot of time on this: different types of signatures, where they're signed, who has to sign, how they sign. It has ramifications across every single document in a letter of credit.

After working on this for a few months now, we're at the point where we are fairly confident that the latest version of the AI will be able to identify pen strokes. We are not there just yet, but we've been working on the basis that we will be ready when the appropriate AI solution becomes available. We're not waiting for the capability to start thinking about it. We've already got it in the pipeline.


Ezequiel: We've talked a lot about the technical side. But what about the people using TradeSpeed day to day? How should they think about working with the system?

Gary: We don't seek to change what the user does. We don't seek to change the process that a bank has for examination of documents. What we want is TradeSpeed to work alongside the user. It does the hard work for you, and you sit there and make the decisions.

I think a lot of people I've talked to about AI haven't got that. They haven't understood that point. That's the way you have to look at it. If you work in a bank today, I can more or less guarantee that when a set of documents is received in a certain region of the world, every single word has a pencil tick. That's the degree of examination going on manually.

What we're saying is: forget your pencil. You don't need a pencil anymore. TradeSpeed is going to do that. What we want you to do as a user is to become a decision-maker. You do not become a pencil-ticker. Elevate your role. Become more involved in the process. TradeSpeed allows people to achieve that elevation in their role.

"What we're saying is: forget your pencil. TradeSpeed is going to do that. What we want you to do as a user is to become a decision-maker."

If a beneficiary client continually gets documents wrong for the same reason, the root cause is often that document examiners don't have the time to speak to them. But if those 20 to 30 sets of documents that document examiners are completing manually today are being handled by TradeSpeed, suddenly a batch of time becomes available. They can sit down with the client and say, "Look, you presented this document, it was wrong for this reason. If you did it this way, you'll be correct." Boom.


Ezequiel: How much can AI be trusted in this space? How much should be automated?

Gary: AI doesn't get rid of the user. And in letters of credit, the answer is not always black and white. There are grey areas where you can go this way or that way, and banks have to make decisions on that. That is where the human element comes in. There still has to be an element of human involvement in this process.

On the other hand, the reality is that a lot of document examiners don't have their own access to a UCP. I've come across people who have never owned nor fully read the UCP. They've been taught it through PowerPoint presentations that show them the various bits they were trained on. Many have also never seen, or had access to, the ISBP.

We're building into TradeSpeed the requirements that exist within the UCP and ISBP, and we point people in the direction of the actual paragraphs and articles to explain why TradeSpeed made its decision.

What we are doing is building confidence in the user that TradeSpeed is capable of doing what we say it does. We've also built check logic, which is basically explaining the examination step by step: what did the system do to reach its conclusion of the check being OK or NOK? This is effectively the process that the user would have gone through manually.
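The "check logic" idea, a verdict that carries its own reasoning trail and rule references, could be modelled like this. The example check and the cited article are illustrative assumptions, not a statement of TradeSpeed's internals.

```python
from dataclasses import dataclass, field

@dataclass
class CheckResult:
    verdict: str                             # "OK" or "NOK"
    steps: list[str] = field(default_factory=list)
    rule_refs: list[str] = field(default_factory=list)

def check_latest_shipment_date(shipped_on: str, latest_date: str) -> CheckResult:
    """Compare the B/L on-board date against the credit's latest shipment date,
    recording each examination step the way a manual examiner would."""
    result = CheckResult(verdict="OK")
    result.steps.append(f"Read on-board date from bill of lading: {shipped_on}")
    result.steps.append(f"Read latest shipment date from the credit: {latest_date}")
    result.rule_refs.append("UCP 600 art. 20(a)(ii)")  # illustrative reference
    if shipped_on > latest_date:  # ISO dates compare lexicographically
        result.verdict = "NOK"
        result.steps.append("Shipment date is after the latest shipment date")
    return result
```

Surfacing `steps` and `rule_refs` alongside the verdict is what lets the user retrace the examination instead of taking the outcome on faith.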

And what our customers experience, and what our demos to banks show, is that we've done it correctly. That can certainly be trusted.


Ezequiel: Do you think there are certain things that TradeSpeed brings to the table that an examiner might not?

Gary: If I may, I'd like to change the question slightly. Because not every examiner is the same. I've come across junior examiners who have picked it up in three weeks: boom, off they go. Others may take three months or more. So it's really down to how they apply the rules, their understanding of the UCP and ISBP, their dedication to actually learn, etc.

On another topic, if I gave out a set of documents to 20 examiners, told them what the discrepancies are, and said, "Write me the discrepancy wording", I would probably get 15 or more different wordings for the same discrepancy.

Now, providing each wording is OK, it probably doesn't matter. But consistency is key for any bank.

Similarly, a common complaint by beneficiaries is: "I gave you a set of documents under this LC three weeks ago, and you told me they were okay. I presented another set of documents today, exactly the same data content, same amount, same quantity, but a different shipment date, and you give me a discrepancy. Why is there a difference?" Answer: Different examiner. Different reading of the same credit.

What TradeSpeed brings to the table is consistency of approach, and that is something that is missing in the manual world today. 

"What TradeSpeed brings to the table is consistency of approach, and that is something that is missing in the manual world today."

Ezequiel: Is there a case where the system has shown something that even you were surprised by?

Gary: I am no longer surprised by anything the AI can bring to TradeSpeed. If anything, I think my surprise has been more over the last six to nine months. This is where I know there are certain checks we have not built yet for a certain document, but we've uploaded a presentation containing that document, and even with my scepticism as to what the output would be, I've been very surprised at what TradeSpeed has achieved.

From a data extraction perspective and categorisation of that data, I have been surprised, when we haven't really touched that type of document in any shape or form.

This comes back to what I said at the beginning, about how many examples we needed for each document. We are at a stage where we can throw a document in, the AI understands what that document is, seems to have an understanding of what kind of data should be in there, and flags it without there being any checks.

Now, we still have to build the checks for this. But the system is helping us get almost 70% of the way there regarding each check, and we've only then got to put the peripheral bits to narrow down what the system has found to that particular check or group of checks.

That has been what has opened my eyes to the way in which the AI is moving, and what we can keep on developing and how we want to run checks in our next generation of checks.

AI is moving, and we move with it, and sometimes ahead of it.

“AI is moving, and we move with it, and sometimes ahead of it.”

Ezequiel: When you think about where the work is heading – towards checks that react more dynamically, that handle ambiguity, that get closer to how an experienced examiner thinks – what can you tell us about this? What excites you?

Gary: I think we're still in a relatively early phase, to be honest. There are still things we need to look at. Can it understand and move forward on a standard transaction? What about a non-standard transaction?

For the original checks that we've been developing over the last four years, I've created the discrepancy language for every single one of them and they're all in a consistent format.

But the AI, as is now being developed for our next generation of checks, has been fed the kind of structure that we want discrepancies to be described in. Just the structure, not any specifics.

What I'm finding now is that we throw the document in, the AI will look at the document, it will find something wrong, and it will draft a discrepancy wording that's 98% exactly the same as what I would have drafted. That is impressive.


Ezequiel: Wow, really impressive. Gary Collyer as a Service!

Gary: Haha, it almost makes me redundant! But that's the reality and we move onto other aspects. A year ago, forget it. If we didn't have discrepancy wordings, the banks would have had to do it themselves. But now, TradeSpeed is actually saying, "Right, I understand how you want discrepancies to be phrased" and then does it. And the reality is it actually brings in more data that it gets from the specific documents.

Those kinds of things are pretty impressive in how we're moving forward and how it will enable us to potentially increase our ability to do more checks in a shorter period of time. That's got to be the goal. Trade finance never stands still.

Things change in trade finance. If suddenly in two years' time the ICC says they're going to revise ISBP, we're going to have a whole group of new or revised checks to potentially create. If we can just say, "Right, we can throw that at the AI and it's done" as opposed to spending days or weeks developing these checks? That's a win-win.


Ezequiel: If you could put one thing on record about how AI and examiners should work together, what would that be?

Gary: It's what I said earlier. Let's not beat about the bush: we go to a prospect, a bank, and sell the idea of automated document examination to the heads of trade transaction banking, the product heads, the heads of trade finance, senior people... and then you start demoing it to the current crop of document examiners, the people on the ground.

There is fear. We sometimes see fear in their eyes. "My job's gone."

And I see it on WhatsApp groups, on LinkedIn. People saying, "AI is going to kill the manual examiners, there's no jobs for document examiners."

Forget it, there are. But why should a document examiner still be sitting in that same chair in another five years, ticking off pieces of data, when they can have an automated solution doing the grafting? Wouldn't they prefer, as the user, to be the one making decisions?

That's what TradeSpeed is providing. A person making decisions. A person saying whether the highlighted discrepancies are correct. TradeSpeed is allowing LC staff to move into a more senior kind of role: decision-maker.

“That's what TradeSpeed is providing. A person making decisions. A person saying whether the highlighted discrepancies are correct. TradeSpeed is allowing LC staff to move into a more senior kind of role: decision-maker.”

And also into a role where the person can spend time with a client, explaining where they went wrong, why they went wrong, how they can correct it and actually moving, almost inadvertently, into a business development kind of role. A person demonstrating to the client, "We've got interest in your business, we want to help you." That can create more business, especially where clients are multi-banked.

So the reality: don't look at TradeSpeed as a competitor or a rival. You're not going to lose your job. Arm in arm, you're working together. You can put the documents into TradeSpeed, go off to the coffee machine, get your coffee, come back, and the documents have been checked for you. Then you make the decision.

Working in a different environment, in a different type of role, will be good for the individual and their personal development.

As you know full well from the articles we've released on the expertise crisis in trade finance operations, there is a lack of new blood. Why is that? People don't want to sit there ticking off bits of data for 10-20 years. In today's world, that's pretty boring.

But if you can say, 'We have a solution that does all that work, and what we're asking you to do is come in, make decisions, speak to clients, educate clients.' Completely new role, completely new development. I think that's the way in which it should be thought of.

"You can put the documents into TradeSpeed, go off to the coffee machine, get your coffee, come back, and the documents have been checked for you. Then you make the decision."

Ezequiel: This has been enlightening, even for me. Thank you Gary. Any final words?

Gary: No, I think I've spoken enough. I told you at the beginning, I can keep talking forever on trade finance.

Try it yourself


If your team is still pencil-ticking every word of a set of documents, we'd love to show you how TradeSpeed works in practice.



We'll walk through a real presentation and show you how TradeSpeed handles the full examination, from UCP and ISBP rule coverage through consistent discrepancy language, check logic, and sanctions and AML red flags.



Gary Collyer

Head of product at TradeSpeed

Gary started his banking career in 1973 with Midland Bank PLC in London and spent more than two decades in trade finance with various leading global trade banks before setting up his own business. Gary was the chairman of the drafting groups for the UCP 600 and the ISBP 745, and more recently co-chair for the eUCP, eURC and ISBP 821.

Ezequiel Djeredjian

Head of marketing

Ezequiel is a marketing and go-to-market leader with a strong background in FinTech and RegTech. He has spent the last six years at the intersection of product, brand and growth in early-stage technology companies, and joined Complidata two years ago to lead marketing and brand for TradeSpeed.

