I have a quick question when it comes to wifi - when all we are talking about is tablets talking to a computer, is there a way to do this via bluetooth? That way scorekeeping could be instantaneous. The computer could push it to the website via wifi, but the scorekeepers would only be talking to the computer. I am probably not understanding something though.
Bluetooth connections are way more finicky to get going and more importantly do not have the required range.
You’d also need custom software on both tablet and computer. Someone has to write all that software. It’s significantly faster to write webapps than reliable bluetooth interfacing software.
The app helped a lot this year in figuring out where I was in the queue. I liked the push notification when you moved up a spot. One suggestion: have the notification tell you which spot you are in. “You are now 11th in line for Whitewater.” That way I don’t have to keep refreshing the app each time I get a notification.
Agreed on everything here. Loved the electronic queuing and the push notifications.
@ASG One further note on a possible bug: on my iPhone, when I opened the app to my applicable queue screen from the iOS notification, the light blue banner and associated menu button were gone, and I couldn’t get out of that pin’s queue screen unless I shut down and restarted the app, which also required me to log in again each time. It was only an issue once I’d played my game and wanted to get into another queue.
Yup, I told Aiton about this bug. As a workaround, if you swipe your screen from right to left, you can bring up the menu where you can click on the Home option.
Well what you really have is… multiple computers talking to a central computer. Android tablets are used because of their portability, touchscreen, and they are DIRT CHEAP.
As mentioned earlier, Replay brought in their own wifi infrastructure to host the tournament with… and for reasons not covered yet… it did not perform up to plan.
Bluetooth range is limited, and introduces a lot of dependencies. Writing stuff that is just straight up IP based and browser based is really universal… and that flexibility really let people adapt and use their own devices easily, etc.
My concern with the app is the slow response. The payload we are pushing around here is not ‘small’ - it’s trivially microscopic. The overhead of the connections and headers probably outweigh the actual data by factors of hundreds. And the server is not handling even hundreds (or even dozens) of write requests per second. There should be little to no blocking and transactions should be completed in milliseconds.
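To put rough numbers on that overhead claim, here’s a sketch. Everything in it is made up for illustration (the endpoint, host, and payload fields are not the real API):

```javascript
// Hypothetical score-submission payload vs. typical HTTP/1.1 request overhead.
const payload = JSON.stringify({ playerId: 1234, machineId: 56, score: 78350210 });

const requestHead = [
  'POST /api/scores HTTP/1.1',          // made-up endpoint
  'Host: scoring.example.org',          // made-up host
  'Content-Type: application/json',
  `Content-Length: ${payload.length}`,
  'Cookie: session=abc123',             // real session cookies are far longer
  'User-Agent: Mozilla/5.0 (iPad; CPU OS 10_3 like Mac OS X)',
].join('\r\n') + '\r\n\r\n';

console.log(`payload: ${payload.length} bytes, headers: ${requestHead.length} bytes`);
```

Even with these short fake headers, the request head is several times the size of the actual score data, before TCP/TLS framing is counted.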
There are no complex tables or graphics in the scorekeeping screens. But even now, browsing the site while it’s effectively idle… on a modern laptop… the renders still take significant time.
Retrieving data is slow too… just pulling up a queue means waiting more than 300ms for the server to respond. It seems the rendering and the server side are the actual bottlenecks.
I originally thought maybe TLS setup was hurting them and they needed to do more keep-alives to ensure the connection setup doesn’t have to keep repeating… but looking at it this morning… still seems like the backend itself has some optimization it could use.
Likely they are using some sort of PhoneGap-style framework to write a universal app. Those are usually a lot slower than native device apps; also, the backend might be a single-threaded Node or other framework, which tends to be less performant.
The transitions often take longer than the data retrieval – it’s a conscious choice to have those transitions.
On a laptop the site is set up in a way where you have significant overhead because each template is fetched individually, on demand when first needed. I’m fairly certain that’s not happening if you’re using the iOS/Android app.
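A minimal sketch of the alternative: cache each template after its first fetch, so later screen transitions skip the round trip entirely. `fetchFn` here is a stand-in for whatever transport the app actually uses:

```javascript
// Memoize templates so each round trip happens at most once per session.
const templateCache = new Map();

function getTemplate(name, fetchFn) {
  if (!templateCache.has(name)) {
    // Store the promise so concurrent callers share one request;
    // only the first use pays the network cost.
    templateCache.set(name, fetchFn(name));
  }
  return templateCache.get(name);
}
```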
300ms is not super duper amazing, but it’s perfectly fine. I wouldn’t consider that something to give first priority. The click delay on an iPhone is even 300ms (i.e. the iPhone will wait 300ms before processing a click on a link in order to wait for a potential double-tap to zoom).
Besides trying to avoid double-taps on a touch device… I don’t agree with those transition delays. Use the visual effects to build that sense of change instead.
It was pretty much front-loaded on the landing page… seems like most of the templates and JS are loaded right on the front page. And the timing I was looking at was waiting for the response, not waiting on cache or page loads. I was using Chrome’s devtools to break out the timeline into rendering, network, etc.
It was taken as an example because it was a simple, read-only, non-blocking query with minimal dependencies. When looking at the different elements of what made up the transaction that was perceived as slow… the largest portion (by more than 10x) was waiting for the query’s response… for an element that, technology-wise, shouldn’t take anywhere near that.
When, as a scorekeeper, I have to submit a score… then click on the next person in queue… then click start… all three of those pages had significant delay before control was passed back to the user after each step. Yes, I raise an eyebrow when even a trivial read-only query takes nearly half a second. Because they add up.
And it’s not the network when we already have a connection, latency is low, and we are passing a couple hundred bytes.
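One mitigation for that submit → next → start chain is optimistic UI: hand control back to the scorekeeper immediately and reconcile with the server in the background. This sketch assumes nothing about the real code; `submitToServer` and `ui` are hypothetical stand-ins:

```javascript
// Return control right away; sync in the background and surface failures.
function submitScore(entry, submitToServer, ui) {
  ui.showNextPlayer();                  // no waiting on the round trip
  submitToServer(entry).catch(() => {
    ui.flagUnsent(entry);               // flag failures so no score is lost
  });
}
```

The tradeoff is that the app now has to handle the rare failed submit after the fact, which is why this is a design choice rather than a free speedup.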
I like the idea of buying in when I purchase Pinburgh tickets. Let me choose a charity then too. Then on the day of I just need to get my player number and pin and go.
I think more signage would help too. A big sign at the desk with the charities listed so you can decide while in line, and the poor volunteer doesn’t have to read that list again. Throw the relevant URLs on there too, and maybe a QR code for the app so I can get my phone ready to log in once I get my info.
Maybe have a staff member at a table by the tournament strictly for queuing those people with dumbphones and dead batteries? This way scorekeepers can keep up with scores better. Maybe this only seemed problematic because of the aforementioned wifi issues. I liked that I could queue others with the app, that’s a slick feature.
I think I preferred last year’s setup with the chair queues. You knew who was ahead of you or missing, no surprises. It lent itself to being more social too, chatting with random people in line. I feel like most of the queue issues this year were from people sitting ten feet away at the tables who just couldn’t hear their name called.
I was queuing up down to the wire this year just to play all of my entries. I felt like last year I got to finish earlier and still had time to enjoy the rest of the show.
I still had a great time though, and thank all the people that put the time and effort in to make it run as smoothly as it did. I had an issue where a skipped player was playing my turn, and the staff was right on top of it. I checked the queue and it said I was playing! Before I even got to the scorekeeper onsite he was already correcting the goof. Top notch.
That’s in the top 3 requested features, and it’s the highest priority for the next release.
@flynnibus : thanks for the feedback on the performance. The good news is I designed the PSS to be as fast as possible. The bad news is I took a bunch of shortcuts during implementation to meet the deadlines of LAX 2017 and PAPA 20, and it seems like they might have affected performance.
At this point, it would be a huge help if you could help me isolate these problems - if you are interested, please DM me so we can talk.
I had a friend who finished a game when the wifi was down. He was waiting by his game, chatting with others who were also waiting for the wifi to return. At some point a non-tournament player walked up to his game, pushed the start button, and erased his score.
Yes, Replay needs better enforcement of “tournament area only.” I personally gently asked at least half a dozen people to stop playing tournament games at various times between rounds. They had no clue they were doing so: there was no signage to distinguish the free-play area from the tournament area. The “barriers” were just section-of-the-room dividers to them.
There are actually eight four-foot-high banners when you enter the area proclaiming the tournament games and when they will be open to the public. It really doesn’t matter how many signs you put up, some folks just won’t notice them.
Right but when the intergalactic started, most games were open to free play.
The answer is simple network saturation. Too many people trying to connect via Verizon to the same server at one time caused severe load issues.
As Doug mentioned, the DLCC really does want $50k to turn on public wifi. You’d think they’re dropping new fiber to the entire hall for that kind of change, but the simple reality is that they vastly overcharge on this service. After realizing early in the week that we would not be able to scrape by with the meager wireless we cobbled out of them last year, I ordered two Cradlepoint cellular modems specifically designed for convention/stadium type usage.
And it worked awesome on Thursday!
Friday we saw some degradation in the connectivity. It was far worse on Saturday, and I apologize. All I can tell you is that I spent 2 hours before we opened trying to make some changes to help it, then I made Aiton and Kevin work on it for at least an hour, and I spent 3 hours in the early afternoon trying to help it, before I decided I could no longer look at anything network related for the remainder of 2017.
Last week I ordered some better antennas, which should boost the signal and provide a stronger connection to the modems. We’ll be testing this in Buffalo. (Hey, @ASG you wanna come to Buffalo?)
We would love your help. Please feel free to reach out and offer your skills: PAPA 20 results online? (PAPA Scoring Software Feedback)
As a user, the pinburgh web updates seemed great all weekend. Only noticed the hiccups during A final. Overall I’d say it was pretty awesome.
Would it be possible to bring a web server to Pinburgh with a PAPA scoring backend application on it? Then run everything locally - still over wifi? And maybe the local server syncs every 5-10 min with the online one. That way if the internet is down/slow the show can still go on… and when it returns the local server can update the remote one. I understand this is not trivial. But I don’t know if it’s 20 hours of work or 200 hours of work to set something like this up. I am also aware there are other higher priority issues. This idea has been kicking around in my head and I just wanted to get rid of it. ha ha.
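For what it’s worth, the write side of that idea can be sketched as a queue-and-flush loop. All names here are hypothetical, and the genuinely hard part (conflict resolution between local and remote state) is exactly what this sketch ignores:

```javascript
// Queue writes locally; periodically flush them to the remote server.
const pendingWrites = [];

function recordScore(entry) {
  pendingWrites.push({ ...entry, recordedAt: Date.now() });
}

async function flush(pushBatch) {
  if (pendingWrites.length === 0) return;
  try {
    await pushBatch(pendingWrites.slice()); // e.g. one POST to the remote server
    pendingWrites.length = 0;               // clear only on confirmed success
  } catch {
    // Remote unreachable: keep the queue and retry on the next interval.
  }
}

// e.g. setInterval(() => flush(pushBatch), 5 * 60 * 1000);  // every 5 minutes
```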
When building webapps you generally choose between live updates or offline availability. Syncing is extremely finicky and time-consuming to develop for, especially when you combine it with self-queueing. Possible, but not really when we’re talking volunteer efforts (just because of the time needed to get it right). It’s a choice made before you write the first line of code, not something that can be tacked on after the fact.