Trackernet HTTP 503 error: service unavailable

The XML feed at

is consistently returning a 503 Service Unavailable error.
I have checked other lines' feeds with the same result. Is this a known issue?

We’re using those ‘predictionSummary’ URLs. I saw that error, but it seems to be intermittent. It’s happening now alongside the data truncation problems I’ve mentioned in another thread, which seem to have picked up again over the past week or so.

More consistent is the failure on this LineStatus URL (which we also use):
We’ve been getting 503 Service Unavailable from that since 4am today.
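For anyone else hitting these intermittent 503s, a retry with exponential backoff is the usual workaround while the underlying service is flaky. A minimal sketch (the `fetch` callable and delay values are illustrative, not anything TfL-specific):

```python
import time

def fetch_with_retry(fetch, retries=3, base_delay=1.0):
    """Call fetch() and retry on 503, doubling the delay each time.

    `fetch` must return an object with a .status attribute (mirroring
    http.client responses); swap in your own HTTP call.
    """
    delay = base_delay
    for attempt in range(retries + 1):
        response = fetch()
        if response.status != 503:
            return response
        if attempt < retries:
            time.sleep(delay)
            delay *= 2  # back off between attempts
    return response  # still 503 after all retries
```

This keeps transient blips from surfacing to users, though it obviously can't help with outages lasting hours.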


503 errors are still very common, yet still no response from TfL about fixing them. Good job they have these forums, huh. Really helpful :joy: @davidtan


I’ll raise a ticket with the team that looks after trackernet.

Do you have the approximate time you last saw the error? This will help them pinpoint the issue in their logs.


Error 503 was showing at approx. 06:02 on 28/06/2017.
Feed OK again at 06:03. @jamesevans


I have raised a ticket with the app team responsible for Trackernet.


Ref: INC000002520412

@ryan.forsyth the Unified API Arrivals endpoint should provide a more stable way to consume predictions information. If there’s information from Trackernet that we’re not surfacing, let us know, thanks.

The data size returned from those feeds is triple that of predictionSummary. It’s going to consume a lot of my users’ data, 3× as much in fact, just to parse it. Not sure where the improvement is for the user?


Is your user consuming the Unified API directly? I agree some of our APIs are not optimised for direct client-side consumption. Mostly they would be used server side, with a minimal set of data then returned to the client.

Which fields from which endpoint would constitute the minimal set for you? We could perhaps make a cut-down version of the endpoint available (some of the APIs already have flags allowing you to filter out parts of the response you don’t need).

I would be looking to use the **lineName**/arrivals feeds.
It wouldn’t be practical for me to parse the data server side and then send the client a minified version. It’s an unnecessary speed bump: with the time sensitivity of the data, no sooner have I parsed it server side than what’s delivered is already out of date.

Could it be possible to limit the keys that we are interested in, so the data returned is only what we need?
Out of all this…

"$type": "Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities",
"id": "281333791",
"operationType": 1,
"vehicleId": "322",
"naptanId": "940GZZLUWHP",
"stationName": "West Hampstead Underground Station",
"lineId": "jubilee",
"lineName": "Jubilee",
"platformName": "Northbound - Platform 1",
"direction": "inbound",
"bearing": "",
"destinationNaptanId": "940GZZLUWYP",
"destinationName": "Wembley Park Underground Station",
"timestamp": "2017-06-29T16:19:05Z",
"timeToStation": 952,
"currentLocation": "At Westminster Platform 4",
"towards": "Wembley Park",
"expectedArrival": "2017-06-29T16:34:57Z",
"timeToLive": "2017-06-29T16:34:57Z",
"modeName": "tube",
"timing": {
  "$type": "Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities",
  "countdownServerAdjustment": "00:00:00",
  "source": "0001-01-01T00:00:00",
  "insert": "0001-01-01T00:00:00",
  "read": "2017-06-29T16:19:05.61Z",
  "sent": "2017-06-29T16:19:05Z",
  "received": "0001-01-01T00:00:00"

I just want about 40% of it:


"vehicleId": "322",
"naptanId": "940GZZLUWHP",
"stationName": "West Hampstead Underground Station",
"platformName": "Northbound - Platform 1",
"direction": "inbound",
"destinationNaptanId": "940GZZLUWYP",
"destinationName": "Wembley Park Underground Station",
"timestamp": "2017-06-29T16:19:05Z",
"currentLocation": "At Westminster Platform 4"


If I could list these keys as parameters, that would be so much better. After all, some of the keys returned are redundant. For example, "lineId" & "lineName": not only are they both effectively the same, they are also already known from the URL which fetches the data. "expectedArrival" & "timeToLive" both appear to give the same data, so are both needed? I also don’t need both a "towards" and a "destinationName", as one would do.

Also "timestamp": does it really need to be duplicated over 500 times? Having that moved to its own object would save a good few bytes of data.
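To put a rough number on the timestamp suggestion, here is a quick sketch comparing a flat list against a response with the shared timestamp hoisted out into its own key. The record shape and the `predictions` wrapper key are illustrative, not anything the API actually offers:

```python
import json

# 500 predictions all sharing one timestamp, as in the Arrivals response
predictions = [
    {"vehicleId": str(300 + i), "timestamp": "2017-06-29T16:19:05Z"}
    for i in range(500)
]

flat = json.dumps(predictions)

# Hoist the shared timestamp to the top level, as suggested above
hoisted = json.dumps({
    "timestamp": "2017-06-29T16:19:05Z",
    "predictions": [{"vehicleId": p["vehicleId"]} for p in predictions],
})

saving = len(flat) - len(hoisted)  # bytes saved by not repeating the field
```

Each repeated `"timestamp": "…"` pair is around 38 bytes, so over 500 records the saving adds up, although (as noted below) gzip already removes most of this redundancy on the wire.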

Ah I see. Yes, the developers here have discussed having a generalised way to filter the JSON after serialisation but before returning it to the client. This could be an optional parameter specifying an array of properties to keep; all other properties would be dropped by the server. I’ll see if I can get that idea resurrected. TECH-247
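For what it's worth, a keep-list filter like the one described could be sketched as follows. This is a hypothetical illustration of the idea, not the actual TECH-247 design, and in practice it would run on the object graph before serialisation rather than re-parsing the JSON:

```python
def keep_fields(value, keep):
    """Recursively drop every dict key not in the keep set.

    Lists are filtered element by element; scalars pass through.
    Note that a nested object (e.g. "timing") survives only if its
    own key is in the keep set.
    """
    if isinstance(value, dict):
        return {k: keep_fields(v, keep) for k, v in value.items() if k in keep}
    if isinstance(value, list):
        return [keep_fields(v, keep) for v in value]
    return value

# The ~40% subset requested above
wanted = {"vehicleId", "naptanId", "stationName", "platformName",
          "direction", "destinationNaptanId", "destinationName",
          "timestamp", "currentLocation"}
```

The server could accept `wanted` as a comma-separated query parameter and apply the filter before responding.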

In the meantime though, you can get a smaller, more timely response to your client, specifically for Arrivals, using our websockets layer with SignalR - there’s a demo at the end of the blog entry I linked to.


Yes, an array of properties to keep would be an idea. Again, though, even not repeating the timestamp 500+ times would help. I’d rather stay away from SignalR as it’ll mean another code base I need to maintain, and it’s bad enough having to cater for Android users :joy:

You should be aware that repeated names compress very efficiently, so even though JSON might look crazily verbose, it’s actually fairly efficient over the wire. If you pass an Accept-Encoding header for gzip/deflate, our servers will do this compression for you, and depending on what programming framework you’re using on the client side, it may well get decompressed for you automatically too.

  ~  curl -s -H 'accept-encoding: gzip, deflate' > with_gzip.gz
  ~  ls -la with_gzip.gz
-rw-r--r--  1 tim  staff  9145 30 Jun 14:41 with_gzip.gz
  ~  curl -s > without_gzip.json
  ~  ls -la without_gzip.json
-rw-r--r--  1 tim  staff  245644 30 Jun 14:42 without_gzip.json

9KB vs 245KB - pretty good compression!
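The same point can be made without curl. Here is a quick sketch using Python's gzip module on synthetic data shaped like the Arrivals response; the absolute sizes will differ from the real feed, but the ratio shows how well the repeated key names compress:

```python
import gzip
import json

# Synthetic payload with the repetitive structure of the Arrivals feed
predictions = [
    {
        "$type": "Tfl.Api.Presentation.Entities.Prediction, "
                 "Tfl.Api.Presentation.Entities",
        "vehicleId": str(i),
        "stationName": "West Hampstead Underground Station",
        "timestamp": "2017-06-29T16:19:05Z",
    }
    for i in range(500)
]

raw = json.dumps(predictions).encode()
compressed = gzip.compress(raw)
ratio = len(raw) / len(compressed)  # repeated names compress heavily
```

This is also why hand-trimming keys saves less bandwidth than the uncompressed sizes suggest: gzip's dictionary already collapses the repetition.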

The timestamps are repeated because they could, in theory, be different for each element.


Accepting that the Unified API may be more stable, I have begun writing my application to use that instead of the Trackernet feed. However, when parsing the two feeds and limiting my application to only unique occurrences of vehicleId/set number, the Unified API returns fewer trains.

Unique train/set numbers from the Trackernet predictionSummary feed:

["352", "351", "350", "347", "346", "345", "344", "343", "342", "341", "340", "337", "323", "322", "321", "317", "307", "306", "305", "304", "303", "302", "301", "165", "164", "163", "162", "161", "160", "157", "156", "155", "154", "153", "152", "151", "150"]

vs the unique train/set numbers returned from the Unified API at the same time:

["352", "351", "350", "347", "346", "344", "343", "342", "341", "340", "323", "322", "321", "317", "307", "306", "305", "304", "302", "301", "165", "164", "163", "161", "160", "157", "156", "155", "154", "152", "151", "150"]

In the examples given, the following are missing from the Unified API:


Why are these trains missing? If the Unified API is only going to find 90% of the trains found by predictionSummary, then why should I use the Unified API?
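For anyone wanting to reproduce the comparison, a set difference over the two lists quoted above does it:

```python
# The two lists of unique train/set numbers posted above
trackernet = {"352", "351", "350", "347", "346", "345", "344", "343", "342",
              "341", "340", "337", "323", "322", "321", "317", "307", "306",
              "305", "304", "303", "302", "301", "165", "164", "163", "162",
              "161", "160", "157", "156", "155", "154", "153", "152", "151",
              "150"}
unified = {"352", "351", "350", "347", "346", "344", "343", "342", "341",
           "340", "323", "322", "321", "317", "307", "306", "305", "304",
           "302", "301", "165", "164", "163", "161", "160", "157", "156",
           "155", "154", "152", "151", "150"}

# Trains reported by predictionSummary but absent from the Unified API
missing = sorted(trackernet - unified)
```

With 5 of 37 trains absent, the Unified API is covering roughly 86% of what predictionSummary reports in this snapshot.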

UPDATE: I believe this forum post has the answer to my question. Trains that are at a terminus or out of service cannot arrive, so they are not shown in the Unified API. Though a ticket was raised 13 days ago, nothing has progressed.

I wanted to give an update on the issues with 5XX errors on the Trackernet feed. The application team that look after that system have been having trouble finding the issue. We’ve put in some enhanced monitoring which should help us find the root cause.



The issue was the box running out of RAM - at least that’s what the IIS error said…

Thanks for the information. Hopefully that will point us in the right direction.

I’ve been advised that the Trackernet application team put in a fix yesterday afternoon that should improve the stability of this service. Our monitoring hasn’t picked up any errors since that was done. If you see any further issues, please feel free to let me know.


That’s great, thank you & the Trackernet team. I was going to move over to the Unified API and use Arrivals; however, as that doesn’t show trains at a terminus, it’s an unusable option for me. So thank you for fixing the issues with the feed.

@ryan.forsyth - No problem - I’ll keep persevering with the terminal arrivals/departures issue as it causes confusion for website users as well as open data consumers. I’ll keep that thread updated when I have more.
