[Javascript] Strange problem with getting a <table> tag's .offsetHeight property.

Matt Barton javascript at mattbarton.org
Tue Oct 3 05:39:49 CDT 2006


Hi,

I have a function in my intranet application which overrides the 
browser's context menu and creates its own.  It displays this context 
menu in a popup created with the window.createPopup() method, so that 
the popup can float over frame boundaries, and even over the browser 
window boundaries.
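
Roughly, the setup looks like this (stripped right down, and the 
helper names here are just for illustration, not the real code):

    // Create the popup once; window.createPopup() is IE-only.
    var menuPopup = window.createPopup();

    document.oncontextmenu = function () {
        var html = buildMenuHtml();   // illustrative helper that builds the menu markup
        showContextMenu(html, window.event.screenX, window.event.screenY);
        return false;                 // suppress the browser's own menu
    };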

It first shows the popup, fills it with HTML generated for the menu, 
grabs the offsetHeight property of a containing table, then hides the 
popup and re-shows it dimensioned correctly for the table it contains. 
This is because popups created with window.createPopup() cannot be 
resized.
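
The measuring function looks more or less like this (again simplified, 
and names like "menuTable" are made up for the example):

    function showContextMenu(html, x, y) {
        // Show the popup at a guessed size first; offsetHeight can only
        // be read once the popup's document is actually displayed.
        menuPopup.show(x, y, 200, 300, document.body);
        menuPopup.document.body.innerHTML = html;

        // Measure the table that wraps the menu items.
        var table = menuPopup.document.getElementById("menuTable");
        var height = table.offsetHeight;   // the value that comes back wrong

        // Hide and re-show at the measured size, since the popup
        // cannot be resized while it is open.
        menuPopup.hide();
        menuPopup.show(x, y, 200, height, document.body);
    }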

This works fine for all of my users, apart from on a couple of new 
widescreen laptops whose users have a larger-than-normal DPI setting 
on their displays.  In these cases the offsetHeight property returns a 
value a good deal smaller than it should be.

Two different systems, running the same OS (WinXP SP2), the same 
browser (IE6) and the same DPI setting (133), one with a widescreen 
display and one with a regular display, return different values of 
offsetHeight for identical tables.  The only difference I can see 
between the systems is the widescreen monitor.

If the user with the widescreen laptop changes his DPI to the standard 
96, it behaves normally, returning values consistent with the regular 
systems.
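
For what it's worth, IE6 does expose the DPI in use to script, so the 
setting each machine is actually running at can be read directly, 
e.g.:

    // IE6 reports the system and logical DPI:
    alert("deviceYDPI="  + screen.deviceYDPI +
          " logicalYDPI=" + screen.logicalYDPI);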

This is observable on all the widescreen laptops we have.

Anyone got any idea at all?

Matt Barton


