I'd like to know if it's possible to remove digits from an int value. For example, I have an int with 6 digits (e.g. 123456); can I take that 6-digit value and remove the last 2 digits so it becomes a 4-digit value (e.g. 1234)? Would that be possible?
One way I'm thinking of: convert the integer to a string, then treat the string as a char array (it is one anyway). Grab the first 4 characters, make that the new string, then parse it back to an integer.
Thank you all for the input. @wolfhunter777, yeah, I did some reading; I could use an array to do that, but I don't fully understand arrays yet. @Eric5h5, wow, I never knew one of the solutions was that simple.
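(For anyone landing here later: the simple solution is integer division. In C#, `/` on two ints truncates toward zero, so dividing by 100 drops the last two digits. A minimal sketch of both the division route and the string route mentioned above; the variable names are mine:)

```csharp
using System;

class DropDigits
{
    static void Main()
    {
        int value = 123456;

        // Integer division truncates toward zero, dropping the last two digits.
        int byDivision = value / 100; // 1234

        // The string route: drop the last two characters, parse back to int.
        string s = value.ToString();
        int byString = int.Parse(s.Substring(0, s.Length - 2)); // 1234

        Console.WriteLine("{0} {1}", byDivision, byString);
    }
}
```

The division route avoids the allocation and parsing overhead of the string route.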
Neat computing tips & tricks: a computer calculates multiplication a lot faster than division, so using * 0.01 would process faster than / 100. The difference is actually pretty big.
Evidence you should leave these micro-optimizations to the compiler…
Attempting one billion divisions!
Result 1234 took 939798 ticks!
Attempting one billion multiplications!
Result 1234 took 932269 ticks!
Attempting one billion shifts!
Result 1234 took 4974061 ticks!
Press any key to continue . . .
using System;
using System.Diagnostics;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            var timer = new Stopwatch();
            var result = 0;
            var number = 123456;

            Console.WriteLine("Attempting one billion divisions!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = number / 100;
            timer.Stop();
            Console.WriteLine("Result {0} took {1} ticks!", result, timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion multiplications!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = (int)(number * .01);
            timer.Stop();
            Console.WriteLine("Result {0} took {1} ticks!", result, timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion shifts!");
            var quotient = 0;
            var remainder = 0;
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
            {
                quotient = (number >> 1) + (number >> 3) + (number >> 6) - (number >> 10) + (number >> 12) + (number >> 13) - (number >> 16);
                quotient = quotient + (quotient >> 20);
                quotient = quotient >> 6;
                remainder = number - ((quotient << 6) + (quotient << 5) + (quotient << 2));
                result = quotient + ((remainder + 28) >> 7);
            }
            timer.Stop();
            Console.WriteLine("Result {0} took {1} ticks!", result, timer.ElapsedTicks);
        }
    }
}
Well, micro-optimization is always possible. Using the 'framework' posted above:
Attempting one billion divisions!
Took 26412686 ticks!
Attempting one billion multiplications!
Took 23780163 ticks!
Attempting one billion shifts!
Took 31701097 ticks!
Attempting one billion hardcode /10 long!
Took 18764953 ticks!
Attempting one billion hardcode /100 long!
Took 18771787 ticks!
Attempting one billion hardcode /1000 long!
Took 18760335 ticks!
20% quicker, but each version only divides by a fixed amount. Range limits:
/10 : 0 <= x <= 1.342.177.280
/100 : 0 <= x <= 1.677.721.599
/1000 : 0 <= x <= 262.143.999
/10000 : 0 <= x <= 1.310.719.999
etc.
Code
using System;
using System.Diagnostics;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            Stopwatch timer = new Stopwatch();
            int result = 0;
            int number = 123456;
            long numberLong = number;

            Console.WriteLine("Attempting one billion divisions!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = number / 100;
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion multiplications!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = (int)(number * .01);
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion shifts!");
            var quotient = 0;
            var remainder = 0;
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
            {
                quotient = (number >> 1) + (number >> 3) + (number >> 6) - (number >> 10) + (number >> 12) + (number >> 13) - (number >> 16);
                quotient = quotient + (quotient >> 20);
                quotient = quotient >> 6;
                remainder = number - ((quotient << 6) + (quotient << 5) + (quotient << 2));
                result = quotient + ((remainder + 28) >> 7);
            }
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion hardcode /10 long!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = (int)((numberLong * 6871947674) >> 36); // safe from 0 up to and including 1.342.177.280
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion hardcode /100 long!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = (int)((numberLong * 5497558139) >> 39); // safe from 0 up to and including 1.677.721.599
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
            timer.Reset();

            Console.WriteLine("Attempting one billion hardcode /1000 long!");
            timer.Start();
            for (int i = 0; i < 1000000000; i++)
                result = (int)((numberLong * 35184372089) >> 45); // safe from 0 up to and including 262.143.999
            timer.Stop();
            Console.WriteLine("Took {0} ticks!", timer.ElapsedTicks);
        }
    }
}
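For what it's worth, those hardcoded multipliers aren't arbitrary: each is a rounded-up fixed-point reciprocal, ceil(2^k / d), so that (x * m) >> k reproduces x / d over a limited range. A minimal sketch of how one might derive them (the Multiplier helper is my own, not from the post above):

```csharp
using System;

class MagicConstants
{
    // Multiplier for dividing by d via (x * m) >> shift:
    // the rounded-up fixed-point reciprocal ceil(2^shift / d).
    static long Multiplier(long d, int shift)
    {
        return ((1L << shift) + d - 1) / d;
    }

    static void Main()
    {
        Console.WriteLine(Multiplier(10, 36));   // 6871947674
        Console.WriteLine(Multiplier(100, 39));  // 5497558139
        Console.WriteLine(Multiplier(1000, 45)); // 35184372089

        // Example: divide 123456 by 100 without a division instruction.
        long number = 123456;
        Console.WriteLine((number * 5497558139L) >> 39); // 1234
    }
}
```

The larger the shift, the more accurate the reciprocal, but the sooner the 64-bit intermediate product overflows; that trade-off is where the range limits quoted above come from.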
Hmm… not sure why, but I get very different numbers than you:
Attempting one billion divisions!
Took 938506 ticks!
Attempting one billion multiplications!
Took 945456 ticks!
Attempting one billion shifts!
Took 6777885 ticks!
Attempting one billion hardcode /10 long!
Took 12609203 ticks!
Attempting one billion hardcode /100 long!
Took 12622182 ticks!
Attempting one billion hardcode /1000 long!
Took 12459758 ticks!
Different processor architecture, maybe; another reason to leave the optimizations to the compiler.
Haha, a couple of months ago I needed something similar to display an amount of money (e.g. 10 million versus 10000000). I took the easy 2-minute route and made a class that divides by million/billion/trillion, thinking I'd implement something else later. I haven't touched it since and it works like a charm.
Possibly. I'm using the most recent Mono version on Linux at the moment, for compiling and running, on an i5-4670K.
I changed the code a bit so the result is the sum of all calculations, to prevent the compiler from optimizing it away. That made the standard divisions as quick as the multiplications, and everything a tiny bit slower.
If I run the Mono AOT compiler over it, nothing changes except the hardcoded versions become an additional 70% quicker (5M ticks, versus 25M for plain division and 19M non-AOT).
Ran the same code (with minor changes to make it a MonoBehaviour) in a Unity build, and it's practically the same as on the most recent Mono on Linux.
I’m mighty interested in knowing what you’re running it on, given the massive difference.
Hmm, now that is interesting. I'm running on an i7-2600, so I doubt there's much difference there.
The big difference is that I'm on Windows, running through the MS compiler that ships with Visual Studio 2013. I'll try running it through Unity and see what happens.
Here are my results from Unity. Your hardcoded method is outperforming, but not by a lot. The main takeaway for me is how slowly my multiplication/division code runs in Unity's sandbox compared to the console application coming out of Visual Studio: it's 60 times slower! I'm shocked.
Attempting one billion divisions!
Took 60602003 ticks!
Attempting one billion multiplications!
Took 51681038 ticks!
Attempting one billion shifts!
Took 142006032 ticks!
Attempting one billion hardcode /10 long!
Took 48874815 ticks!
Attempting one billion hardcode /100 long!
Took 48955529 ticks!
Attempting one billion hardcode /1000 long!
Took 52623455 ticks!
EDIT: I wasn't doing this test correctly: I didn't build my Unity game into an executable. Here are the results from a built, runtime-compiled version. Something very bizarre: the performance of the hardcoded methods actually got worse. The performance of division is still 30 times worse than what's coming out of VS.
Attempting one billion divisions!
Took 31653434 ticks!
Attempting one billion multiplications!
Took 20700862 ticks!
Attempting one billion shifts!
Took 47634917 ticks!
Attempting one billion hardcode /10 long!
Took 83387255 ticks!
Attempting one billion hardcode /100 long!
Took 83863405 ticks!
Attempting one billion hardcode /1000 long!
Took 84452969 ticks!
…It got slower by building it? What? How is that even possible? Unless you were running something else that was straining the processor. And I can't see how a double multiplication would be 4 times quicker than a long multiplication plus a long shift…
Try a project with the code below, and we'll be sure. It's the version that sums all the results, in the hope of preventing them from being optimized away. With that exact code, I got this in output_log.txt (after cutting the cruft out):
Attempting one billion divisions!
Result 1234 Took 25164277 ticks!
Attempting one billion multiplications!
Result 1234 Took 25113807 ticks!
Attempting one billion shifts!
Result 1234 Took 33666410 ticks!
Attempting one billion hardcode /10 long!
Result 12345 Took 19513724 ticks!
Attempting one billion hardcode /100 long!
Result 1234 Took 19624001 ticks!
Attempting one billion hardcode /1000 long!
Result 123 Took 19455630 ticks!
Code
using System.Diagnostics;
using UnityEngine;

class Program : MonoBehaviour
{
    private void Start()
    {
        Stopwatch timer = new Stopwatch();
        int number = 123456;
        long numberLong = number;
        long loops = 1000000000;
        long AllResults = 0;

        UnityEngine.Debug.LogFormat("Attempting one billion divisions!");
        timer.Start();
        for (long i = 0; i < loops; i++)
            AllResults += number / 100;
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
        AllResults = 0;
        timer.Reset();

        UnityEngine.Debug.LogFormat("Attempting one billion multiplications!");
        timer.Start();
        for (long i = 0; i < loops; i++)
            AllResults += (long)(number * .01);
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
        AllResults = 0;
        timer.Reset();

        UnityEngine.Debug.LogFormat("Attempting one billion shifts!");
        var quotient = 0;
        var remainder = 0;
        timer.Start();
        for (long i = 0; i < loops; i++)
        {
            quotient = (number >> 1) + (number >> 3) + (number >> 6) - (number >> 10) + (number >> 12) + (number >> 13) - (number >> 16);
            quotient = quotient + (quotient >> 20);
            quotient = quotient >> 6;
            remainder = number - ((quotient << 6) + (quotient << 5) + (quotient << 2));
            AllResults += quotient + ((remainder + 28) >> 7);
        }
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
        AllResults = 0;
        timer.Reset();

        UnityEngine.Debug.LogFormat("Attempting one billion hardcode /10 long!");
        timer.Start();
        for (long i = 0; i < loops; i++)
            AllResults += (numberLong * 6871947674) >> 36; // safe from 0 up to and including 1.342.177.280
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
        AllResults = 0;
        timer.Reset();

        UnityEngine.Debug.LogFormat("Attempting one billion hardcode /100 long!");
        timer.Start();
        for (long i = 0; i < loops; i++)
            AllResults += (numberLong * 5497558139) >> 39; // safe from 0 up to and including 1.677.721.599
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
        AllResults = 0;
        timer.Reset();

        UnityEngine.Debug.LogFormat("Attempting one billion hardcode /1000 long!");
        timer.Start();
        for (long i = 0; i < loops; i++)
            AllResults += (numberLong * 35184372089) >> 45; // safe from 0 up to and including 262.143.999
        timer.Stop();
        UnityEngine.Debug.LogFormat("Result {0} Took {1} ticks!", AllResults / loops, timer.ElapsedTicks);
    }
}
I'm really surprised by this. Division is now performing best. I ran this test a dozen times because the results are so weird, but I get approximately the same numbers every time, even after a fresh restart.
Attempting one billion divisions!
Result 1234 Took 61127390 ticks!
Attempting one billion multiplications!
Result 1234 Took 94618417 ticks!
Attempting one billion shifts!
Result 1234 Took 98620894 ticks!
Attempting one billion hardcode /10 long!
Result 12345 Took 122287644 ticks!
Attempting one billion hardcode /100 long!
Result 1234 Took 129082681 ticks!
Attempting one billion hardcode /1000 long!
Result 123 Took 131903149 ticks!
EDIT: Results from a console application using your code. Apparently some optimization was being done before, since my numbers are much slower now. Still, the hardcoded methods underperform, and the code in Unity runs about 10 times slower.
Attempting one billion divisions!
Result 1234 Took 6094277 ticks!
Attempting one billion multiplications!
Result 1234 Took 12787509 ticks!
Attempting one billion shifts!
Result 1234 Took 11564522 ticks!
Attempting one billion hardcode /10 long!
Result 12345 Took 15344659 ticks!
Attempting one billion hardcode /100 long!
Result 1234 Took 15310919 ticks!
Attempting one billion hardcode /1000 long!
Result 123 Took 15300337 ticks!
This thread is old… and it does not reflect the real speed of any of the calculations in this example.
The reason is that the loop itself, for (long i = 0; i < loops; i++) with an Int64 counter, takes longer than the AllResults += number / 100; it is measuring.
Something like this gives you more realistic results:
for (long i = 0; i < loops / 1000; i++) {
    AllResults += number / 100; // repeated 1
    AllResults += number / 100; // repeated 2
    AllResults += number / 100; // repeated 3
    ...
    AllResults += number / 100; // repeated 1000
}
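A way to sketch that idea without writing out 1000 repeated lines: time an empty loop of the same shape first, and subtract it as a baseline. This is my own variant of the suggestion above, not code from the thread (and note a JIT may eliminate the empty loop entirely, so treat the baseline as a rough lower bound):

```csharp
using System;
using System.Diagnostics;

class Baseline
{
    static void Main()
    {
        const long loops = 100000000; // fewer iterations for a quick run
        var timer = new Stopwatch();
        long allResults = 0;
        int number = 123456;

        // 1. Time the loop with no work in the body (loop overhead only).
        timer.Start();
        for (long i = 0; i < loops; i++) { }
        timer.Stop();
        long overhead = timer.ElapsedTicks;

        // 2. Time the same loop with the division in the body.
        timer.Restart();
        for (long i = 0; i < loops; i++)
            allResults += number / 100;
        timer.Stop();

        // 3. Subtract the loop overhead from the measurement.
        Console.WriteLine("Division alone: ~{0} ticks", timer.ElapsedTicks - overhead);
        Console.WriteLine("Result check: {0}", allResults / loops); // 1234
    }
}
```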