Genii Weblog

When is a bug not worth fixing?

Tue 28 Sep 2004, 01:11 PM



by Ben Langhinrichs

It seems so simple to software users.  There should be no bugs, no limitations, and no difficult workarounds.  I have been a software user for years, and it certainly feels like that to me.

But how does a software vendor decide whether a bug is worth fixing?  That is something I struggle with frequently.  The hard cases are usually bugs that few users encounter, that have a fairly clean workaround, and whose fix is likely to cause more trouble than the original bug did.

Scenario
A customer's agent was copying tables and then deleting rows from them.  The agent was not crashing, but it was reporting an odd error:

MIDAS: Unable to save additional deleted PAB definitions.

and the rows were losing justification after the delete.  Now, this sounds like a concrete, repeatable bug, the kind I like to fix right away.  So the customer sent a sample, which included the code:

Set rtchunk = rtitem.DefineChunk("Table *;Row 1-" + Str(FIRST_ROW - 1))
Call rtchunk.Remove

My first reaction is, "Cool!", because I love it when somebody actually uses the more complex variations of chunk definitions.  Unfortunately, that is also the problem.  FIRST_ROW can be 8, and there are as many as 12 columns, so as many as 84 cells (7 rows × 12 columns) can be deleted in each table, with an unknown number of tables.  Internally, we store paragraph definitions (PABs) for the separate cells to determine whether they will be re-used later, and the limit is 100, which is almost always plenty.  Just not this time.
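To make the limit concrete, here is a purely illustrative sketch, written in LotusScript for readability; this is not Midas's actual code, and the names, structure, and error number are all hypothetical.  Only the limit of 100 and the parallel arrays come from the description above:

Const MAX_DELETED_PABS = 100

Dim pabIds(1 To MAX_DELETED_PABS) As Long       ' parallel array: which PAB was deleted
Dim pabReused(1 To MAX_DELETED_PABS) As Integer ' parallel array: was it re-used later?
Dim pabCount As Integer

Sub RememberDeletedPab(pabId As Long)
   ' Two tables of 84 deleted cells each means 168 definitions, well past 100
   If pabCount >= MAX_DELETED_PABS Then
      Error 20001, "MIDAS: Unable to save additional deleted PAB definitions."
   End If
   pabCount = pabCount + 1
   pabIds(pabCount) = pabId
   pabReused(pabCount) = False
End Sub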

Workaround
The workaround for this user is simple: cycle through the tables one at a time:

Set rtchunk = rtitem.DefineChunk("Table 1")
Do
   ' Delete this table's rows only, staying under the internal limit
   Set rtrow = rtchunk.DefineSubchunk("Row 1-" + Str(FIRST_ROW - 1))
   Call rtrow.Remove
Loop Until Not rtchunk.GetNextTarget   ' move on to the next table, if any

and if that were not enough, you could break it down further with an inner loop for each row, as in the sketch below.  This works, but obviously I wish customers didn't have to worry about this stuff.
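For completeness, here is a sketch of that finer-grained variant, deleting a single row at a time.  It assumes the remaining rows renumber after each deletion, so that "Row 1" always targets the current top row:

Dim i As Integer
Set rtchunk = rtitem.DefineChunk("Table 1")
Do
   For i = 1 To FIRST_ROW - 1
      ' Each Remove deletes the current first row; the rows below shift up
      Set rtrow = rtchunk.DefineSubchunk("Row 1")
      Call rtrow.Remove
   Next
Loop Until Not rtchunk.GetNextTarget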

Possible solutions
I could just change the limit to 255 instead of 100.  That would still only allow this user three tables, but going any higher would add a lot of overhead to deletions.  I could go with a linked list approach, but that is a lot of change for this purpose, since I use parallel arrays.  I could rewrite the logic so that duplicate PAB references were not stored multiple times, which is the correct solution, but I worry about the implications I haven't thought of.  So for now, I'll swallow hard and accept another limitation, document it somehow, and hope that one day soon I'll get tired of it and fix it properly.  In the meantime, like all the Notes "oddities", this remains a Midas "oddity".  I don't imagine the IBM engineers like leaving them around either, but the risk of change may outweigh the risk of leaving bad enough alone.
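For what it's worth, the duplicate-elimination fix would look something like the following sketch, which reuses the hypothetical parallel arrays from the earlier sketch: look a PAB reference up before appending it again.  Since the cells of a uniformly formatted table tend to share one definition, the 84 deleted cells would likely collapse to just a few entries:

' Hypothetical sketch of the proper fix: store each PAB reference only once
Sub RememberDeletedPabOnce(pabId As Long)
   Dim i As Integer
   For i = 1 To pabCount
      If pabIds(i) = pabId Then Exit Sub   ' already cached; nothing to add
   Next
   ' Not seen before, so append it with the limited routine sketched earlier
   Call RememberDeletedPab(pabId)
End Sub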


Update: A fix has been made (that certainly didn't take long, did it?).  See Fewer bugs, less sleep for details.

Copyright © 2004 Genii Software Ltd.
