Patrick B. answered 01/15/19
Math and computer tutor/teacher
1 terabyte = 1 trillion bytes = 1,000,000,000,000 bytes = 10^12 bytes
1 gigabyte = 1 billion bytes = 1,000,000,000 bytes = 10^9 bytes
1 megabyte = 1 million bytes = 1,000,000 bytes = 10^6 bytes

So:
1 terabyte = 1,000 gigabytes
1 terabyte = 1,000,000 megabytes

Since 1 byte = 8 bits, 1 terabyte = 8 trillion bits = 8 x 10^12 bits
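If you want to check the arithmetic yourself, here is a quick Python sketch of the same decimal (SI) conversions (the constant names are just for illustration):

```python
# Decimal (SI) unit sizes, in bytes
MB = 10**6   # 1 megabyte
GB = 10**9   # 1 gigabyte
TB = 10**12  # 1 terabyte

# 1 terabyte expressed in gigabytes and megabytes
print(TB // GB)  # → 1000 gigabytes
print(TB // MB)  # → 1000000 megabytes

# 1 byte = 8 bits, so convert the whole terabyte to bits
print(8 * TB)    # → 8000000000000 bits, i.e. 8 x 10^12
```

Note that operating systems sometimes report sizes using binary units instead (1 gibibyte = 2^30 bytes), which is why a "1 TB" drive can show up as roughly 931 GB.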