Same as before: Never add offsets to time instances like this. It will not work as you expect.
I don't want to be stubborn, but I don't understand why the given offset calculation would be wrong. The program imports the java.util.SimpleTimeZone class from the core library. Oracle's documentation says:
SimpleTimeZone is a concrete subclass of TimeZone that represents a time zone for use with a Gregorian calendar. The class holds an offset from GMT, called raw offset, and start and end rules for a daylight saving time schedule. Since it only holds single values for each, it cannot handle historical changes in the offset from GMT and the daylight saving schedule, except that the setStartYear method can specify the year when the daylight saving time schedule starts in effect.
The program asks the SimpleTimeZone class in the core library for the offset, in hours, at the given local time. The given time is then converted to a number of ticks, the ticks corresponding to the offset hours are calculated and subtracted, and the remaining ticks are converted to GMT time.
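The steps above can be sketched roughly as follows. This is only my reconstruction of the described calculation, not the original program: the time zone ("Europe/Amsterdam") and the example instant are placeholders I chose for illustration.

```java
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class OffsetSketch {
    public static void main(String[] args) {
        // Hypothetical example zone; the original program's zone is not stated.
        TimeZone tz = TimeZone.getTimeZone("Europe/Amsterdam");

        // Step 1: the given local time, converted to ticks (ms since the epoch)
        // by treating the wall-clock fields as if they were GMT.
        GregorianCalendar wall = new GregorianCalendar(TimeZone.getTimeZone("GMT"));
        wall.set(2023, Calendar.JULY, 1, 12, 0, 0);
        wall.set(Calendar.MILLISECOND, 0);
        long localTicks = wall.getTimeInMillis();

        // Step 2: ask the zone for its offset at that instant.
        long offsetMillis = tz.getOffset(localTicks);

        // Step 3: subtract the offset ticks; the remainder is the GMT time in ticks.
        long gmtTicks = localTicks - offsetMillis;

        System.out.println("offset hours: " + offsetMillis / 3_600_000);
        System.out.println("GMT ticks:    " + gmtTicks);
    }
}
```

Note that `TimeZone.getOffset(long)` is documented to take a UTC instant; whether the program passes local ticks or UTC ticks to it is exactly the detail the discussion hinges on.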
If we limit ourselves to this calculation method, and take into account the absence of the exceptions Oracle mentions, I do not see what would be wrong with calculating the time difference between two time zones this way.
I learned while maintaining laboratory instruments that a result must be verified against a second source, where the outcome of that second measurement or calculation is based on a different measurement method.
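In the same spirit, one independent second source for this kind of check could be the java.time API, which resolves offsets from the full time-zone database rather than from SimpleTimeZone's single raw offset plus DST rules. The zone and instant below are placeholders of my choosing, not values from the original program:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class SecondSource {
    public static void main(String[] args) {
        // Hypothetical local time and zone for illustration.
        LocalDateTime local = LocalDateTime.of(2023, 7, 1, 12, 0);

        // Attach the zone: java.time picks the correct offset (including DST)
        // for that wall-clock time from the tz database.
        ZonedDateTime zoned = local.atZone(ZoneId.of("Europe/Amsterdam"));

        // Convert to UTC as an independent cross-check of the tick arithmetic.
        System.out.println(zoned.withZoneSameInstant(ZoneOffset.UTC));
    }
}
```

If both the tick-based subtraction and java.time yield the same GMT instant, that is the kind of second-source agreement described above.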
A second source is a set of time zones for which the check gives the same result as the program:
Edit: P.S. Perhaps I should have provided some background information on the program to make this clearer.