Is Xojo Web Performance REALLY that bad?

That’s less than 20 users per instance!!!

It is. While I can host a Vaadin Flow app with 800 concurrent users on the same server, Xojo is not good for 100. Exactly.

Yes they are. On a prototype app in Web 1.0 or 2.0 I got 15-20 max users, whereas with Vala/libsoup on the same box I had no issues with well over 200, which was the number of users I needed to support with that web app.

Yeah, it’s not a good situation. When WE was in its infancy, the goal was compiling to JavaScript and running in the browser, but the tech wasn’t mature enough at the time. These days, WE3 should exist (in my opinion) and do exactly that. The server has far too much work to do.

For a Web 1 app we could get about 35 users logged in before it would crap out. Once we implemented a timed logout and moved reports and email into separate console apps, they only hit the max a couple of times a week. They could live with it at that point.

Not hate, not trolling, those were the actual numbers: 15 users per instance!!!

However:

  • This app had ALL the users logging in at the same time, and most of the interactions from the users could also happen at the same time. Other kinds of apps could handle more users.
  • These are Web 1 stats; Web 2 is still too buggy to spend money on it.

In this case, it was done in Xojo for the RAD features and the short development time. The app was used by groups of 100 up to a max of 200 users for periods of 10 minutes to 3 hours, and running 12 instances was fine. But for more users, or an app that needs constant usage, Xojo is not an option.

Fun fact: this app was used over the INTRANET. I’m not sure if those 15 users per instance would hold up over the Internet :thinking:.


Depends of course on the intranet, but usually an intranet should have faster ping times, etc. I had a customer with a Web 2 intranet app for 25 users. Up to approximately 20 users all was “fine”; above 20 concurrent logins it became troublesome, and I would not even think of trying to go beyond 30. And this server had a fast SSD, 128 GB RAM, enough cores - in a nutshell => a big disaster and a shame. So it must have been me, as Xojo is fast as hell! :slight_smile:

There was a good reason why I suddenly walked away, annoyed and angry :cry: - and I rarely get angry, but in this case…

Considering it was not your fault, I would say they were lying about the server’s capabilities. Maybe it was a 386SX computer with 16 MB of RAM?

Maybe that was the problem, my server only had half of that RAM :upside_down_face:

Frustrating, I know :frowning: In the “tests”, I got 30 users per instance… decided to go with 20, and in the first real usage with 120 users… MANY of them had disconnections. I had a little panic attack, but with Xojo being Xojo, I was kind of prepared: I launched the extra instances and had a successful run with only a couple of users annoyed. :sweat:


There is an interesting post with a detailed description of the setup for a Xojo-based backend:
https://forum.xojo.com/t/multi-user-web-app-with-a-great-many-simultaneous-users/76229/39

I lack the backend expertise to tell whether this is good or not so good in terms of Xojo’s performance. What’s the experts’ take?

I’m not an expert, but it seems like a lot of hardware for an API.

Well, it is a lot of requests too. But in the end, the transfer of data is usually the smallest part for APIs - some JSON (so text only). The bottleneck is usually the database behind it, and of course whether the backend logic is capable of running multiple threads in parallel and making use of all the cores and the hardware’s performance (see the sketch below).

We know what Xojo can do here, so I think it is fair to conclude that on the same hardware Go, Java, etc. will outperform Xojo all the time. And those solutions are for free …
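To illustrate the point above about using all the cores: here is a minimal sketch of a multi-threaded JSON endpoint using only the JDK’s built-in com.sun.net.httpserver (the endpoint path and class name are made up for the example; a real backend would add routing, pooled DB connections, etc.). Each request is handed to a fixed thread pool sized to the core count, so the CPU and the database, not the web layer, become the limit.

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.Executors;

public class MiniApi {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // One worker thread per core: requests are processed in parallel,
        // so all cores can be used instead of a single event loop.
        server.setExecutor(Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors()));

        server.createContext("/api/ping", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (var os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("Listening on :8080");
    }
}
```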

10^6 requests per 24 hours is 11.5 requests per second. With 5 servers, it’s about 2.3 requests per server per second. Considering that the load is not equally distributed over 24 hours, we could assume 10 times more requests per second at peak time. That would be about 23 requests per second per server (instance).
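Making that back-of-the-envelope calculation explicit (the figures and the 10x peak assumption are taken straight from the post above):

```java
public class LoadEstimate {
    public static void main(String[] args) {
        double requestsPerDay = 1_000_000;  // 10^6 requests per 24 hours
        int servers = 5;
        double peakFactor = 10;             // assume peak is 10x the daily average

        double perSecond = requestsPerDay / (24 * 3600);  // ~11.6 req/s overall
        double perServer = perSecond / servers;           // ~2.3 req/s per instance
        double perServerPeak = perServer * peakFactor;    // ~23 req/s per instance at peak

        System.out.printf("avg %.1f req/s, %.1f per instance, ~%.0f per instance at peak%n",
                perSecond, perServer, perServerPeak);
    }
}
```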

Of course it is. Our Java API handles about the same amount of traffic without even needing load balancing yet. At about 5 million we plan to start a new instance, although tests showed 10 million was the breaking point at peak moments; we want to stay on the safe side.

Running on one Debian server: 4 CPUs (4 cores per socket), 8 GB RAM. The Java API can handle about 2500 requests/second. The same server also runs our MySQL, web apps and a couple of background workers handling various tasks and data.


Write a Java API for it and use one Mac mini M1 with 8 GB RAM for it: done. What a clown. There is no really performant solution with Xojo because it is simply too slow, and that is not changeable. By the way, it does not run stably - not on macOS, not on Linux, not on Windows. What is that even supposed to be? A few connections with 5(!) Mac minis… oh my god.


Same with Vaadin. I have some of them running.

There is no need for that kind of solution except for users who want to have Xojo as a language. But the performance is always bad.


Yep, plus most of Xojo’s users don’t have millions of requests. Buy a 5 USD server and most people will be more than happy with the power for a sole API (some microservice framework for Java, Go, Rust, Python… you name it) and a Postgres, Mongo or MySQL DB. Done. Using Xojo for sole APIs is IMHO not only a joke but a waste of resources.

#disapprovedByGreta


my favorite combination is Java and HSQLDB
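For anyone who hasn’t used HSQLDB: a minimal sketch of that combination via plain JDBC, assuming the HSQLDB jar is on the classpath (the table and data are only illustrative):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HsqldbDemo {
    public static void main(String[] args) throws Exception {
        // In-memory HSQLDB database; use a file: URL instead for a persistent one.
        try (Connection con = DriverManager.getConnection(
                "jdbc:hsqldb:mem:demo", "SA", "");
             Statement st = con.createStatement()) {

            st.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name VARCHAR(64))");
            st.execute("INSERT INTO users VALUES (1, 'alice')");

            try (ResultSet rs = st.executeQuery("SELECT id, name FROM users")) {
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " -> " + rs.getString("name"));
                }
            }
        }
    }
}
```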


I’ve never run into a case where the DB was the bottleneck with a decent DB (Postgres, Oracle, MS, etc.).
I’ve never run a site with SQLite, so no comment there.

EDIT: for context, a near-real-time system I worked on used Oracle and easily handled 70K inserts per minute,
AND an expert system reading data from it to do analysis of the current system state & operation.
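Insert rates in that range are usually reached by batching rather than committing row by row. A rough JDBC sketch of the idea (the connection string, table and column names are placeholders, not details from the original post):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsert {
    public static void main(String[] args) throws Exception {
        // Connection details are placeholders; any JDBC-compliant database works the same way.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/metrics", "app", "secret")) {
            con.setAutoCommit(false);  // commit once per batch, not once per row

            String sql = "INSERT INTO readings (sensor_id, value) VALUES (?, ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (int i = 0; i < 10_000; i++) {
                    ps.setInt(1, i % 100);
                    ps.setDouble(2, Math.random());
                    ps.addBatch();
                    if (i % 1_000 == 0) {
                        ps.executeBatch();  // send the accumulated rows in one round trip
                    }
                }
                ps.executeBatch();
            }
            con.commit();
        }
    }
}
```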

People on TOF believe that a Java solution is just an eye-catcher phrase… Man, losing touch with reality like that is dangerous.