adzscott Posted June 8, 2016

I'm having trouble expressing actor values (Perception and Luck in this case) as integers. The end goal is to multiply the player's Luck and Perception levels to produce a third integer (PL). Any attempt to express actor values as integers within a perk fragment (see the two examples below) spits out the "type mismatch while assigning to a int (cast missing or types unrelated)" error.

int PL = (Game.GetPerceptionAV()) * (Game.GetLuckAV())

int P = Game.GetPerceptionAV()

Any idea what's causing this?
KataPUMB Posted June 8, 2016

Use GetValue() instead. It returns a Float, though; I don't know whether assigning it to an Int casts automatically or whether you have to cast it explicitly (variable as Int).
FrankFamily Posted June 8, 2016

GetLuckAV() is like Game.GetPlayer(): it returns the ActorValue form itself so you can point to it (instead of making a property); it does not return the actor's current value for that actor value. Hence the type mismatch. Also, Papyrus won't implicitly cast a Float down to an Int, so cast the result with "as Int". GetValue() is a member of ObjectReference, so call it on the player; this should work:

Int PL = (Game.GetPlayer().GetValue(Game.GetPerceptionAV()) * Game.GetPlayer().GetValue(Game.GetLuckAV())) as Int
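For context, a minimal sketch of how that line might sit inside a full perk fragment. The script name and entry-point signature below are placeholders; the real fragment signature depends on which perk entry point the fragment is generated for:

Scriptname PRKF_LuckTimesPerception Extends Perk Hidden

; Hypothetical entry point; the actual name and parameters come from
; the perk entry point this fragment is attached to.
Function Fragment_Entry_00(Actor akOwner)
    ; GetValue() returns a Float, so the product is a Float;
    ; the explicit "as Int" cast truncates it to an Int.
    Int PL = (akOwner.GetValue(Game.GetPerceptionAV()) * akOwner.GetValue(Game.GetLuckAV())) as Int
    Debug.Notification("PL = " + PL)
EndFunction

Since GetValue() lives on ObjectReference, it has to be called on some actor (akOwner here, or Game.GetPlayer()) rather than bare.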