" /> Status for Andrew DeFaria: November 6, 2005 - November 12, 2005 Archives


November 12, 2005

Several PQA Fixes

  • Added Exit Sub/Function to several validation and initialization functions in order to allow TransferState to work
  • Fixed some bugs with Fixed_In_SW_Version and OS values of "Novell 6 Pack Beta 3 ". Not sure how this got by before with the trailing space but it reared its ugly head in TransferStates
  • Changed pqamerge to treat the Submit state like the Assigned state
  • With the implementation of ID equality, I have to constantly regenerate the database from scratch. Experienced some problems with this and had to change the schema by allowing admin to submit. This may have been a different problem in that I now need to subscribe all users and groups to the Cont database and Upgrade the database each time I create it anew
  • Transitioning through states has revealed some fields that were not initially transferred from the TO database. The fields missed were ResolvedBy, ResolveNote and VerifyNote

Additional Schema Action Hook Changes

In order to set State properly pqamerge needs to transition through states to get to the desired end state. For example, many defects in Prod are in the Closed state. But when pqamerge first creates the defect in Cont it will be in the Assigned state. In order to get to the Closed state pqamerge needs to apply the Resolve, Verify and Close actions. As such it triggers state change action hook code.
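
The transition logic can be sketched like this (in Python rather than the cqperl the scripts actually use; the state and action names come from the paragraph above, while the linear happy path and function name are assumptions):

```python
# Map each state to the action that advances toward Closed and the
# state that action produces. Assumed linear happy path:
# Assigned -> Resolved -> Verified -> Closed.
NEXT_ACTION = {
    "Assigned": ("Resolve", "Resolved"),
    "Resolved": ("Verify", "Verified"),
    "Verified": ("Close", "Closed"),
}

def actions_to_reach(current, target):
    """Return the list of actions to apply to move from current to target."""
    actions = []
    state = current
    while state != target:
        action, state = NEXT_ACTION[state]  # KeyError if target unreachable
        actions.append(action)
    return actions
```

For a defect created in the Assigned state that should end up Closed, `actions_to_reach("Assigned", "Closed")` yields `["Resolve", "Verify", "Close"]`, which is exactly the sequence described above.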

For example, in order to transition to the Verified state the Verify action must be applied. When the Verify action happens the action hooks for Verify are run. Part of what they do is initialize owner to blank. It is expected that the user executing the Verify action will fill that in. But that messes us up since we are not a person and we already have the "correct" data. Investigating this further reveals other places where, for example, the current date is put into a field such as Resolve_Date and calculations are made in other fields such as TimeFromSubmitToResolve. We don't want any of that happening!

To resolve these issues the following Action Hooks have an Exit Sub or Exit Function placed at the beginning of the subroutine or function call so as to avoid the incorrect updating of data fields and so that email is not sent out (the other thing many of these Action Hooks do):

  • Submit: Validation
  • Assign: Notification
  • Resolve: Initialization; Validation; Notification
  • Verify: Initialization; Validation; Notification
  • Re-Open: Notification
  • Close: Initialization; Notification
  • Modify: Notification
  • Unassign: Notification
  • DoesNotVerify: Initialization; Validation; Notification
  • VerifiedPendingCustVerify: Notification
  • CustomerVerified: Notification
  • Data_Pending: Notification

I should have just done all of them, or perhaps tried to change the Actions table from using Basic script to None, but the above set seems to be working.

Data Issue

Ran the merge and now it's taking 6 hours 54 minutes 39 seconds. This is due to running through the various states to obtain the appropriate state and "burning" IDs so that the IDs match. One remaining problem that I don't know how to fix: Defect Prod00012546 is in the Closed state yet lacks any VerifyNote. When transferred to Cont pqamerge tries to go through the states and gets stuck trying to transition this defect from Resolved to Verified due to the lack of a VerifyNote. I do not know how this happened in the Prod database. The only thing I can think of is that somebody modified Prod00012546 after it passed the Verified state, blanking out VerifyNote.

I'm running check_attachments now and I expect that to result in 0 differences in attachments.

vobadm@P4TEST /dev/d/PQA
$ cqperl W:/it_scm/adm/cq/check_attachments -v
Grand total (old): 2955822684
Grand total (new): 2955822684

November 11, 2005

Final PQA fixes

  • Implemented TransferStates
  • Found final problem with attachments and fixed it
  • Implemented ID number equality

November 10, 2005

Finding the missing 261,285,366 bytes

  • Fixed bug in pqamerge that caused some attachments to not transfer

I think I've figured out where that missing data went. As you know, the total of the attachment sizes of the old databases compared to the new database was still off by a relatively substantial amount (261,285,366 bytes - see https://defaria.com/blogs/Status/archives/000471.html#more).

I changed check_attachments to help me find where the missing data was. The basic idea was to read all Cont records, total up the attachment size, then use Cont's old_id field to locate the old record and total that. With those two figures I could find which records didn't convert correctly.
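
The comparison amounts to something like this (a Python sketch, not the actual check_attachments code; the data shapes and the find_bad_records name are hypothetical):

```python
def find_bad_records(new_totals, old_totals):
    """Compare per-record attachment byte totals between the new (Cont)
    database and the old (Prod/TO) databases; return records that differ.

    new_totals: {cont_id: (old_id, attachment_bytes)}
    old_totals: {old_id: attachment_bytes}
    """
    mismatches = []
    for cont_id, (old_id, new_size) in new_totals.items():
        old_size = old_totals.get(old_id, 0)
        if new_size != old_size:
            mismatches.append((cont_id, old_id, old_size, new_size))
    return mismatches
```

Any record whose old and new attachment totals disagree points straight at a conversion bug, which is how the missing bytes were tracked down.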

Turns out:

 # At this point we don't have any info about whether we are
 # coming from Prod or TO, however, there are the following fields:
 #
 #   TO                Prod              Cont
 #   ----------------  ----------------  ----------------
 #   Attachments       Attachments       Attachments
 #   AttachmentsBRCM   AttachmentBRCM    AttachmentsBRCM
 #
 # You may notice that Prod: AttachmentBRCM is missing the "s".
 # Therefore:
 $field_name = "AttachmentsBRCM" if $field_name eq "AttachmentBRCM";

Remerging and will run check_attachments again.

November 9, 2005

Removing DBs from a Schema/Email link concerns

  • Investigated how to remove databases from a CQ Schema
  • Created template files for Clearquest Web Login page to swap in for Phase I and Phase II onto the production web server (under C:\Temp\PQA)
  • Discussed the email link problem with Rational. While they can't say definitively what is supported they have stated that the old email link URL will not work in the new schema
    • Databases contained in a Clearquest Schema Database

      A Clearquest connection profile connects to a Clearquest schema database. This schema database defines many things, one of which is which user databases this schema database covers. So with the old 2001 schema all of Prod, TO and NAS are defined in there. The Clearquest Designer has Delete Database and Undelete Database to remove and re-add databases to the schema. The names are misnomers. The schema database has a table, master_dbs, which lists all the user databases that this schema database knows about. What is really happening when you Delete Database is simply that Clearquest Designer is toggling the is_deleted field in master_dbs for this database to 1, meaning it's "deleted". Undelete Database merely toggles it back to 0. Being marked deleted means that it will no longer show up as a database to select for Clearquest Windows Clients and the web server. The web server, however, needs to be restarted to notice the change.

      Another Email URL link issue

      With the concept of having multiple schema repositories and databases, you are right to be concerned about the user being confused when clicking on an email link. Normally the email link is of the format:

      http://<server>/cqweb/url/default.asp?id=<id>

      If the user is logged into CQ Web then they will go directly to the defect detail. If not, they will make a brief stop at the login screen. With multiple schema repositories the user will have to select the proper schema repository and database during login. However, you can add additional parameters to set the default schema repository and database like so:

      http://<server>/cqweb/url/default.asp?id=<id>&dbset=<dbset>&db=<db>

      For example:

      http://pcsjca-ccrmt03/cqweb/url/default.asp?id=Cont00009460&dbset=2005.02.00&db=Cont
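
Building such a link can be sketched as follows (Python; the cqweb_link helper is hypothetical, but the URL shape and the dbset/db parameters are as shown above):

```python
from urllib.parse import urlencode

def cqweb_link(server, defect_id, dbset=None, db=None):
    """Build a ClearQuest Web URL, optionally pre-selecting the
    schema repository (dbset) and user database (db)."""
    params = {"id": defect_id}
    if dbset:
        params["dbset"] = dbset
    if db:
        params["db"] = db
    return f"http://{server}/cqweb/url/default.asp?{urlencode(params)}"
```

With the dbset and db parameters supplied, the user skips the repository/database selection step at login and lands on the right defect.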

November 8, 2005

Web Server Configuration and documentation

More on attachments

  • Enhanced check_attachments to log better

Well it's better and didn't take as long as I thought. This time it took 4 hours 34 minutes 49 seconds. Not too bad. And the attachments size is closer. The total attachments for Cont is still < the sum of the attachments from Prod and TO:

Prod total attachment size:       2,683,569,547
TO total attachment size:           272,253,137
Total Prod + TO attachment size:  2,955,822,684
Cont total attachment size:       2,694,537,318
Difference:                         261,285,366

I'm not sure how to account for that difference.
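The figures themselves are internally consistent; a quick check of the arithmetic:

```python
# Totals as reported by check_attachments, in bytes.
prod_total = 2_683_569_547
to_total = 272_253_137
cont_total = 2_694_537_318

combined = prod_total + to_total      # what Cont should hold
difference = combined - cont_total    # bytes unaccounted for
print(combined, difference)           # 2955822684 261285366
```

So 261,285,366 bytes really did fail to make it into Cont; the question at this point was where they went.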

November 7, 2005

check_attachments

  • Worked out a plan for how to perform the PQA Merge this weekend
  • Concerned about the size discrepancy in the database, I wrote a little utility, check_attachments, to total up the size of all attachments in all the databases. This turned up a bug in pqamerge which I fixed. Re-running pqamerge
  • There is a problem with using clearprompt to prompt for list input for bin_merge - You can only have one line of prompting text and only 50 characters in that line. It's gonna be hard to describe a binary merge situation in 50 characters or less. Investigated making a PerlTk list dialog. In theory it can be done as ccperl does support PerlTk.
  • Discussed with Naga the email link issue with PQA

TransferAttachments

I knew there was a good reason why this was eating at me. There is indeed a bug. My attachments transfer routine was not getting called at all for Prod! I wrote a small Perl script to check this and it yielded:

$ cqperl check_attachments
Totaling attachments in TO...
Totaling attachments in Prod...
Totaling attachments in Cont...
Total attachment size for TO = 272253137
Total attachment size for Prod = 2683569547
Total attachment size for Cont = 272253137

Notice that the size of the attachments for TO matches the size of attachments for Cont! IOW no Prod attachments got transferred at all!

Turns out I was only calling TransferAttachments when the field name was AttachmentsBRCM (and TransferAttachments then did all the attachments - both Attachments and AttachmentsBRCM). However, in Prod the field is named AttachmentBRCM - note the singularity here! I tell ya my eyes are going.

I'm changing my code not to be dependent on the field name and to just call TransferAttachments for each record. TransferAttachments doesn't need to know the field name - it just does all of them.
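
In outline the fix looks like this (a Python sketch; the real script is cqperl, and the record shape and ATTACHMENT_FIELDS name are illustrative):

```python
# The buggy version only fired when it saw one specific field name;
# the fix runs the transfer once per record and accepts every spelling
# of the attachment fields across the databases.
ATTACHMENT_FIELDS = {"Attachments", "AttachmentsBRCM", "AttachmentBRCM"}

def transfer_attachments(record):
    """Copy every attachment on the record regardless of which
    attachment field holds it; return total bytes transferred."""
    total = 0
    for field, attachments in record.items():
        if field in ATTACHMENT_FIELDS:
            for size in attachments:
                total += size
    return total
```

Keyed off the record rather than a particular field name, Prod's singular AttachmentBRCM no longer slips through.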

Unfortunately this means that I have to delete all of the records currently in the Cont database on p4test. And it also means that the approximately 4 hour running time will probably increase. However, this does explain the discrepancy in the database sizes.

Out of Disk Space

Shivdutt Jha wrote:
Good job Andrew, another mystery is solved.

What is more telling is that we have 2.6 Gig of attachments!

In any event my merge failed with an out-of-disk-space error. The pqamerge script uses the current directory to temporarily hold the attachments as they are transferred from one DB to the other. Turns out that Prod00010818 has two very large attachments, one 328,485,466 bytes and another 219,095,209 bytes. Unfortunately I was running in ~vobadm/My Documents and that disk filled. So another lesson learned - run pqamerge in a directory on a disk with lots of disk space! Alas this means I have to start this process over again (I'm trying to get a clean run and a timing of how long to expect the merge to run. pqamerge outputs how long it takes to do its work).
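
A precheck along these lines would catch the problem before hours of merging (Python sketch; the 600 MB threshold is an assumption sized to the two large attachments above plus slack):

```python
import shutil

def enough_space(work_dir, needed_bytes):
    """Return True if work_dir's filesystem has at least needed_bytes free."""
    return shutil.disk_usage(work_dir).free >= needed_bytes

# Hypothetical guard before starting the merge:
# if not enough_space(".", 600_000_000):
#     raise SystemExit("Not enough scratch space for attachments; "
#                      "run pqamerge from a bigger disk")
```

Checking the merge's working directory up front turns a 6-plus-hour failed run into an immediate, actionable error.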