Could Someone Give me Advice on Optimizing a Complex Xojo App for Better Performance?

Hello there,

I have been working on a fairly complex Xojo application that handles a significant amount of data processing and user-interface interaction. Despite my efforts to optimize the code, the application becomes sluggish when dealing with large datasets or when multiple UI elements are updated simultaneously.

This has become a critical issue, especially since the app is intended for real-time data analysis.

The app frequently reads and writes data from a local SQLite database. The database interactions are quite frequent, especially during peak usage times.
The user interface is rich and includes multiple listboxes, text fields, and custom graphics elements. I am using a lot of dynamic controls that are created and destroyed at runtime.

I have noticed that the main slowdowns occur during database operations and when updating the UI. For example, populating a listbox with data from the database takes a noticeable amount of time, and there are lags during UI updates when multiple elements are changed at once.

I have indexed the database tables and optimized the queries to reduce the number of database calls.
I have attempted to batch updates and reduce the frequency of UI refreshes.

Despite these efforts, the performance gains have been marginal. I am reaching out to the community for advice :blush: on more effective strategies to optimize the performance of my Xojo application.

Also, I have gone through this post: https://ifnotnil.com/t/is-xojo-web-performance-really-that-bad-golang/ which definitely helped me out a lot.

Specifically, I am looking for:

  • Best practices for handling large datasets in Xojo without compromising UI responsiveness.
  • Techniques for optimizing database interactions, especially with SQLite.
  • Efficient ways to manage dynamic UI elements and minimize redraw times.

Thanks in advance for your assistance and help. :innocent:

First of all… What Xojo version? What OS are you running it on? Is it a Web App?

SQLite is NOT made for that kind of usage :expressionless: You need a real DBMS for that.

Why are you not reusing them?

I second that. I ran a genome PCR analysis of the bovine genome, which I wrote myself, on an old Mac Pro from 2010 (96 GB RAM, 3.47 GHz, two 6-core CPUs, 12 cores), using 22 cores via re-used workers (each analysing 100,000 basepairs at a time). A run took 2-6 hours, depending on which chromosome was being analysed, inserting 50,000+ records each second into a Postgres database - Postgres didn't even break a sweat.

The folks on StackOverflow said it couldn’t be done on such an old computer, that it would take months or that my computer would crash before finishing - it took 3 days … after I had sorted out the bugs.

If you want, I'm happy to share the code - it's nothing special. The biggest obstacles were:

  • communicating with the workers - mostly solved by passing binary information as a number where each bit position stood for a command option (e.g. 11010010 would have been transmitted as 210 and would have set options 2, 5, 7 and 8 - such as isForwardStrand As Boolean - to TRUE, as binary is evaluated right to left):
  1     1     0     1     0     0     1     0   = 210
  128   64    32    16    8     4     2     1
  • having a worker manager that distributed the jobs to each worker and received notice when one was done
  • giving the start and end position on the chromosome for each worker to read from the sequence file and analyse
  • and getting my head around Postgres
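The bit-packing trick described above can be sketched like this - illustrative Python rather than the original Xojo worker code, and flag names other than isForwardStrand are invented to match the 210 example:

```python
# Hypothetical flag names; the bit values follow the 210 example above.
IS_FORWARD_STRAND = 1 << 1   # bit 2, value 2
OPTION_5          = 1 << 4   # bit 5, value 16
OPTION_7          = 1 << 6   # bit 7, value 64
OPTION_8          = 1 << 7   # bit 8, value 128

def encode(*flags):
    """Pack individual option flags into one command integer."""
    command = 0
    for flag in flags:
        command |= flag
    return command

def is_set(command, flag):
    """Check whether a given option bit is set in a received command."""
    return (command & flag) != 0

command = encode(IS_FORWARD_STRAND, OPTION_5, OPTION_7, OPTION_8)
# 2 + 16 + 64 + 128 = 210, i.e. 0b11010010 as in the table above
```

The advantage is that one integer crosses the process boundary instead of several named options, which keeps the worker protocol trivial.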

P.S. The bovine genome is about 2.7 billion basepairs - and I needed to analyse BOTH strands - resulting in 5.4 billion entries in the database. I actually ran out of space on my hard disk at one point, which was my own fault: I had opened a transaction and never closed it, so the error log files took up over 400 GB by the time I noticed. I still think Postgres should keep an eye on the size of the error log and warn you.

2 Likes

Before jumping ship to another DBMS you need to figure out what the bottlenecks really are.

Have you profiled your code to see what operations are taking the most time?

Since you’re using SQLite, I assume this is a single-user application?

How big is the data in the DB? How much DB writing vs reading? If you read the same data over and over, moving it to an in-memory DB can help.
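As a rough sketch of the in-memory idea (Python's sqlite3 rather than Xojo, with a made-up schema): copy the on-disk database into a `:memory:` connection once, then serve the repeated reads from RAM.

```python
import sqlite3, tempfile, os

# Placeholder file name and schema, purely for illustration.
path = os.path.join(tempfile.mkdtemp(), "data.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
disk.executemany("INSERT INTO readings (value) VALUES (?)",
                 [(i * 0.5,) for i in range(1000)])
disk.commit()

mem = sqlite3.connect(":memory:")
disk.backup(mem)          # copies the entire database into RAM (Python 3.7+)
disk.close()

# Subsequent reads now hit memory instead of the disk file.
count = mem.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```

SQLite itself also supports opening `:memory:` databases directly, so the same pattern should be reproducible from Xojo's SQLiteDatabase.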

I agree on reusing instead of creating/destroying UI objects (or any objects really).

Are you using a lazy-loading listbox design? Or at least “freezing” the UI while loading data?

1 Like

The user interface is rich and includes multiple listboxes, text fields, and custom graphics elements. I am using a lot of dynamic controls that are created and destroyed at runtime.

I have noticed that the main slowdowns occur during database operations and when updating the UI. For example, populating a listbox with data from the database takes a noticeable amount of time, and there are lags during UI updates when multiple elements are changed at once.

1/ Load the database into memory, don't access it from disk.
2/ Only request the fields you need; don't 'select * from' and cherry-pick what comes back.
3/ Index any unique fields which you search/filter on.
4/ Don't load a listbox with 16 thousand records (or whatever). Look around for lazy-load listboxes, or do the same yourself.
5/ Dynamic controls created and destroyed are bad. Create them all upfront and make them visible/invisible by all means, or re-use them. (How, is hard to say without sight of the UI.)
6/ "significant amount of data processing" - offload calculations such as sums, counts, groups etc. to the database - don't use a loop in your code to do that.
7/ If entering value X affects total Y, you might update a pre-computed Y every time you save a new value of X, rather than adding up all Xs and then updating Y, or returning that in a query. (Using that technique on a business app I had cause to review in 1994, I managed to get one part of the app to return values within a couple of minutes, where the previous version literally took 4 days to complete. I won't name the app because I remain fond of the place where I did the work.)
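To make point 4 concrete, here is one way a lazy-load listbox can fetch its data - a Python/SQLite sketch with an invented `items` table, using keyset pagination so each page query stays fast regardless of scroll depth:

```python
import sqlite3

# Invented table standing in for whatever the listbox displays.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO items (name) VALUES (?)",
                 [(f"item {i}",) for i in range(16000)])

PAGE_SIZE = 100

def fetch_page(conn, last_seen_id=0):
    """Return the next page of rows after last_seen_id.

    Unlike a growing OFFSET, the WHERE clause here uses the primary-key
    index, so page 160 is as cheap as page 1."""
    return conn.execute(
        "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
        (last_seen_id, PAGE_SIZE)).fetchall()

first_page = fetch_page(conn)                       # ids 1..100
second_page = fetch_page(conn, first_page[-1][0])   # ids 101..200
```

The listbox then only ever holds one page's worth of rows; the next page is requested when the user scrolls near the bottom.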

2 Likes

One of the red flags I look for in DB applications is a loop over a RecordSet that in turn calls another SQL statement of some sort. That is almost always a place that can be simplified.

Using Views to get the EXACT data you want is better than trying to piece it together with multiple queries.
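A minimal sketch of that idea (Python + SQLite here, schema invented for illustration): one view that joins and aggregates on the database side replaces a loop of per-row queries in application code.

```python
import sqlite3

# Invented schema: the view delivers exactly what the UI needs in one query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         total REAL);
    CREATE VIEW customer_totals AS
        SELECT c.name, COUNT(o.id) AS order_count, SUM(o.total) AS spent
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.id;
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (1, ?)",
                 [(10.0,), (15.0,)])

# One SELECT against the view instead of one query per customer.
rows = conn.execute("SELECT * FROM customer_totals").fetchall()
```

SQLite supports views natively, so the same CREATE VIEW statement should work from Xojo's database classes unchanged.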

If you have a Select * in a query, you can probably make it more efficient by bringing back only the columns you really need.

Instead of dynamically creating UI, I would look at making container controls that can easily be made visible/invisible. Create them once and set their visibility as needed. I find containers to be one of the more powerful parts of Xojo that people don't take advantage of.

If you have reference tables that you are querying a lot to get an ID (or name or whatever), maybe store the data in a Dictionary. If you've already looked it up, use the result from the previous query. In the long run, the longer it runs, the fewer queries your app will make to get data. It will use more RAM, though, so it's a tradeoff.

3 Likes

That’s a dumb response because it’s never that simple and you know it. Rewriting is not always feasible. Like many projects, the deadlines are sometimes ‘last week’, so you have to kick the rewrite can down the road. So that response is not helpful IN THIS MOMENT.

Yeah, we get it. You love Java. Hate Xojo. How about helping with what the OP asked for instead of proselytizing? It’s not helpful.

6 Likes

I have to agree with Bob.
‘Do it in Java’ is not an answer to someone who is ‘doing it in Xojo’
I personally would not consider rewriting a complex app in Java for one millisecond.

We all get that you love Java, and I won’t argue the benefits - but most of the techniques that besossen uses to optimise this are valid in any language.

3 Likes

There is a problem with the Xojo MySQL and SQLite database connectors. As far as I know, the MonkeyBread one is better, so I would recommend it.

As Xojo's string handling is sadly slow by design, it may be hard to make that part much faster. The only advice I can give is to take care how you handle the data coming from the DB. While string handling is that slow, you cannot do much to bring it to a realistic speed.

As far as I have seen, the latest Xojo releases are much faster than the older ones. May I ask which release you are using?

As the UI also has to deal with string handling, especially when populating lists and tables, you may have a problem with no real workaround. I would strongly recommend using the latest Xojo version for optimum speed. Sadly, you cannot run that functionality in another CPU thread. The GUI problem is a really old one; I wrote about it a long time ago. Possibly it can be optimized, but not without knowledge of the code behind it, especially for the dynamically generated GUI components.

I second the suggestion to switch databases.

A LOT can be offloaded to the right database, and even something as garbage as Access can look like a rockstar when connected to an external database.

You can then use views and stored procedures or their equivalents to move complex stuff off to the server, and your application can serve as a viewer, preferably for specific views that send only the final information you want presented.

SQLite is great for many things, including prototyping and planning, but I don’t think it’s made for anything super heavy like what you are doing. And Xojo is already slow with some things to begin with, so offloading work will help a lot.

SQLite can be darned quick.
And since each connection is isolated from the others, in a truly threaded app it can perform like a champ despite everything.
But Xojo’s threads are cooperative, so that won’t help much, if at all.
Helper apps (possibly even Xojo Workers) that connect to the same DB might help, since they are separate processes.
Transactions may be required, and they can have some impact on concurrency.
Coordinating everything will be the challenge.

But even then, SQLite with Xojo is extremely slow. It can be darned quick, yes, but the performance I saw with Xojo's SQLite interface was not as good as it should be. The MonkeyBread variant was better; maybe it would be worth taking a look at it?

The string-handling problem that Xojo definitely has can only be mitigated by reducing the work Xojo has to do by itself. So I guess Norman is right with his idea of transactions.

The MBS variant certainly has some advantages.
Can’t hurt to look there.

Xojo threads are cooperative so you use only a single core → useless.

SQLite isn’t truly multi-user so separate workers aren’t really independent of each other → useless

I’m not convinced that SQLite CAN’T do it - it is pretty fast. The question is whether his workflow requires MULTI-CORE or can be run on a single core. An alternative might be CubeSQL, which is based on SQLite and provides a manager that can deal with the workers … but it costs money, while Postgres is free.

1 Like

I’ve done it just fine with many apps writing to the same db
Not using Xojo workers though
Separate spawned instances of a Xojo console app all writing to the same db
As many as a few hundred at once

The metric missing here is how many inserts per second that resulted in. Without that it doesn’t mean much.

From SQLite:

Actually, SQLite will easily do 50,000 or more INSERT statements per second on an average desktop computer. But it will only do a few dozen transactions per second. Transaction speed is limited by the rotational speed of your disk drive.

Nowadays that limit should be higher due to the speed of SSDs.

To put that into context: my workers wrote the results (100,000 inserts each) into Postgres in transactions of 1,000 inserts, so at 50,000 inserts per second that was 50 transactions per second. With SSDs, SQLite should be able to handle that.
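For what it's worth, that batching is easy to sketch (Python's sqlite3 here, invented schema): grouping INSERTs into transactions of 1,000 means SQLite pays the per-transaction commit cost 50 times per 50,000 rows instead of 50,000 times.

```python
import sqlite3

# Invented results table standing in for the real workload.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (pos INTEGER, score REAL)")

BATCH = 1000
rows = [(i, i * 0.1) for i in range(50_000)]

for start in range(0, len(rows), BATCH):
    with conn:   # one transaction per batch; commits on normal exit
        conn.executemany("INSERT INTO results VALUES (?, ?)",
                         rows[start:start + BATCH])

total = conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]
```

On disk (rather than `:memory:` as here) the difference between per-row and per-batch commits is dramatic, since each commit forces a sync to storage.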

1 Like

The inserts could be batched using stored procedures and written as one transaction, though. That would definitely increase the speed, especially if Xojo didn’t have to do the batching itself.

Sorry, I’ve usually pooh-poohed SQLite, so I didn’t realize it supported SPs until now. I’ve used it mostly for prototyping, switching to another database after. :smiley:

I was using it to load each text file of a project into a database table.
I spawned 700 instances and it loaded the entire project in a matter of a couple of seconds.
As for “inserts per second”, I never measured that, because TPC-C wasn’t my goal.

Trying the same thing with Workers crashed spectacularly.
And a single-threaded test was about as slow as the IDE is at loading a large project.

The speed up was orders of magnitude faster

The entire problem - the bottleneck - is the DB interface and Xojo's string handling. Both are really slow. Comparing it with other tools, it is definitely hard to bring the connection up to industrial-standard speed.

1 Like