
 zMUD Speed Benchmarks 

I've decided to stop benchmarking other MUD clients, and to focus just on zMUD.  I've learned that comparing MUD clients is often an "apples vs oranges" comparison.  Usually the person performing the benchmark is also a MUD client developer, and naturally wants their own client to look good.  Everyone claims that their MUD client is "best" or "fastest."  Since each client does things in a different way, it's always possible to come up with a benchmark that makes any specific client look the best.  So, I'm not even going to try.  Perhaps someday, someone independent will come up with a good set of benchmarks (if you do this, let me know and I'll add a link here, no matter how well zMUD does in your results).  For those still interested in benchmarks done by MUD client authors, I've put some links at the end of this document (including a link to my previous benchmark page).

Now, on to the zMUD benchmarks.  Each time a new zMUD version brings a dramatic change in speed, or a new public version is released, I'll try to add it to this table.


Version   Test 1   Test 2   Test 3   Test 4   Test 5   Test 6
v6.57       *4.5     *4.4     *4.8     *5.1     *8.9     *4.1
v6.25        5.4      4.9      5.5      5.6     10.6      4.4
v6.16        7.4      9.0      9.3      9.4     12.0     55.2
v5.55        6.7      8.0      8.3      8.6     23.0      9.7
v4.62       12.5     13.6     14.4     14.9     39.6     16.2

(* = best time for that test)

(Table last updated: 8-Apr-2003)

Test results are in seconds.

Tests were done with a 300 MHz client computer running Windows 2000.  The server was a 300 MHz computer running Caldera OpenLinux, acting as a simple telnet server.  The LAN connection was 100 Mbps.  The client window size was 100 columns by 32 rows, with a 10pt Courier font.

The specific tests are:

  1. cat /etc/termcap.  Tests plain text scrolling speed: no triggers, no parsing, just raw text speed.
  2. cat high.txt.  A large file containing a large number of ANSI color changes.  Tests the speed of ANSI control code parsing.
  3. cat high.txt, with the client font switched from Courier to Arial.  Tests the speed of the client with a font that does not have fixed character spacing.
  4. cat high.txt with the following triggers: xxxx0000, aaaa1111, bbbb2222, cccc3333, dddd4444, eeee5555, ffff6666, gggg7777, hhhh8888, iiii9999.  These 10 triggers have patterns that are never matched by any line in high.txt, so this tests the trigger processor's ability to quickly discard triggers that don't match.  (Back to the Courier font.)
  5. cat high.txt with the triggers from test 4, along with the new triggers: a, b, c, d, e.  Each trigger has a command of {a=1} to perform a simple variable assignment.  Since each trigger pattern is a single letter, these triggers fire on almost every line from the MUD.  Tests the trigger processor's speed when matching lines and executing scripts.
  6. #loop 1,10000 {a=%i}.  Tests the raw speed of the scripting language.
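To make the trigger workload in tests 4 and 5 concrete, here is a minimal toy sketch of that kind of matching loop.  This is a hypothetical illustration, not zMUD's actual implementation; the names run_triggers, never_match, and single_letter are invented here, and the Python lambda stands in for the {a=1} trigger command.

```python
import re
import time

def run_triggers(lines, triggers):
    """Match each line against every trigger; return (fire count, variables)."""
    variables = {}
    fired = 0
    # Compile each pattern once, as a trigger processor would.
    compiled = [(re.compile(pattern), action) for pattern, action in triggers]
    for line in lines:
        for pattern, action in compiled:
            if pattern.search(line):
                action(variables)  # run the trigger's script action
                fired += 1
    return fired, variables

# Test 4 analogue: patterns that never match, so every check is a discard.
never_match = [(p, lambda v: None) for p in ("xxxx0000", "aaaa1111", "bbbb2222")]

# Test 5 analogue: single-letter patterns that fire on almost every line,
# each performing a simple variable assignment (like {a=1}).
single_letter = [(p, lambda v: v.update(a=1)) for p in ("a", "b", "c")]

lines = ["the quick brown fox jumps over the lazy dog"] * 1000

start = time.perf_counter()
fired, _ = run_triggers(lines, never_match)
print(f"discard-only pass: {fired} fires in {time.perf_counter() - start:.4f}s")

start = time.perf_counter()
fired, variables = run_triggers(lines, single_letter)
print(f"matching pass: {fired} fires in {time.perf_counter() - start:.4f}s")
```

The discard-only pass measures pure pattern-rejection overhead (test 4), while the matching pass adds the cost of executing an action on every hit (test 5), which is why the two tests are benchmarked separately.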

Client notes:

  • v3.62: Windows 2000 can no longer run 16-bit programs, so this version could not be tested.
  • v6.25: Added significant optimizations to increase speed in all areas, especially in raw script speed (test 6).

Other benchmarks:

  • Previous Zugg Software benchmark of different MUD clients.
  • MUSHClient's benchmark of various MUD clients.



© 2009 Zugg Software.