LSLforge improperly handling hexadecimal integers #13

Closed
RayZopf opened this issue Feb 26, 2016 · 4 comments
Comments

RayZopf commented Feb 26, 2016

https://code.google.com/archive/p/lslforge/issues/44
Posted on Apr 29, 2014 by Grumpy Hippo
Given a hexadecimal value in your LSL scripts, LSLforge will automatically convert it to an integer value. However, this is bugging up some code I was testing.

Namely, the hex value 0x80000000 is being converted to 2147483648, but when entered into a script in SL or InWorldz, such as llOwnerSay((string)0x80000000); the integer value reported is -2147483648.

This is with the 64bit Linux version of LSLforge. A friend of mine suspects that LSLforge (or maybe Eclipse) is converting it to a 64bit integer, when it should be 32bit.
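A minimal Haskell sketch (not LSLforge code, just an illustration of the suspected cause): on 64-bit GHC the default Int type is 64 bits wide, so the literal 0x80000000 keeps its positive value, while a true 32-bit integer wraps to -2147483648, which is what SL reports.

import Data.Int (Int32)

main :: IO ()
main = do
  print (0x80000000 :: Int)                             -- 2147483648 on a 64-bit GHC build
  print (fromIntegral (0x80000000 :: Integer) :: Int32) -- -2147483648, matching SL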

Comment 1

Posted on May 10, 2014 by Swift Horse
In my environment, it produces -2147483648 because my CPU is still 32-bit... Line 62 of Type.hs:

data LSLValue a = IVal Int | FVal a | SVal String | VVal a a a

I guess you can fix it by replacing Int with Int32, but I can't confirm it...

Comment 2

Posted on May 16, 2014 by Happy Wombat
No issue on Win7-64 with Eclipse Kepler Service Release 2 (Build id: 20140224-0627):

java.version=1.6.0_45
java.vm.info=mixed mode
java.vm.name=Java HotSpot(TM) 64-Bit Server VM
java.vm.specification.name=Java Virtual Machine Specification
java.vm.specification.vendor=Sun Microsystems Inc.
java.vm.specification.version=1.0
java.vm.vendor=Sun Microsystems Inc.
java.vm.version=20.45-b01

Using the 32-bit LSLForge Windows executable from my own repo (https://github.com/RayZopf/LSLForge_patched/blob/master/lslforge/haskell/dist/build/LSLForge/LSLForge.exe), compiled with 32-bit GHC 6.10.4 (http://www.haskell.org/ghc/download_ghc_6_10_4#windows).

Attachments
Issue_44-hex_integer-SL.lsl
Issue_44-hex_integer-LSLForge.lsl.lslp
Issue_44-hex_integer-LSLForge.lsl.lsl
Comment 3

Posted on May 16, 2014 by Grumpy Hippo
Yeah, it's not going to show up on 32-bit builds, even if you are running them on a 64-bit system, because they still execute in 32-bit mode. It'll only show up if you use a 64-bit LSLforge build on a 64-bit system.

Issue_44-hex_integer-SL.lsl.txt
Issue_44-hex_integer-LSLForge.lsl.txt
Issue_44-hex_integer-LSLForge.lslp.txt
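As a quick sanity check (not part of LSLforge), compiling this one-liner with the same GHC used to build LSLForge shows which Int width that build gets:

main :: IO ()
main = print (maxBound :: Int)
-- prints 2147483647 on a 32-bit build, 9223372036854775807 on a 64-bit build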

@PellSmit

In haskell/src/Language/Lsl/Internal/Type.hs, the LSL integer is defined like this:

data LSLValue a = IVal Int | FVal a | SVal String | VVal a a a 
               | RVal a a a a | LVal [LSLValue a] | KVal LSLKey
               | VoidVal deriving (Show,Eq,Ord)

In this definition, IVal holds an Int value.
Int is a 32-bit integer on 32-bit Haskell, but it is 64-bit on 64-bit Haskell.

It seems we can fix it by replacing IVal Int with IVal Int32.
But I can't try it because we don't have 64-bit Haskell 6.10.4 on Mac.
Can anyone try it on 64-bit Haskell?
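A minimal, self-contained sketch of the proposed change, assuming IVal simply switches its payload to Int32 (the LSLKey stand-in and the mkIVal helper are hypothetical, for illustration only; this is not the actual LSLforge patch):

import Data.Int (Int32)

type LSLKey = String  -- stand-in for LSLforge's real key type, for illustration only

data LSLValue a = IVal Int32 | FVal a | SVal String | VVal a a a
                | RVal a a a a | LVal [LSLValue a] | KVal LSLKey
                | VoidVal deriving (Show, Eq, Ord)

-- A parsed integer literal (read as an unbounded Integer) is truncated
-- to 32 bits on construction, matching in-world behaviour.
mkIVal :: Integer -> LSLValue a
mkIVal = IVal . fromIntegral

main :: IO ()
main = print (mkIVal 0x80000000 :: LSLValue Float)   -- IVal (-2147483648)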


PellSmit commented Sep 19, 2018

I think it was fixed in 0.1.9.4.

@PellSmit

It was fixed by commit 5f62382.
I think it's time to close this issue.

@raysilent (Owner)

Got it!
