C++ :: Turn Binary File Data Into Unsigned Character Array For Inclusion In Executable
Jul 10, 2013
So I wrote a program to turn a binary file's data into an unsigned character array for inclusion in an executable. It works just super.
I'm wondering how I can write a program that will perform this operation on every file in a directory and all its sub-directories, so that I can include everything I need all at once.
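For the recursive part, C++17's <filesystem> makes the directory walk almost trivial. Here's a minimal sketch, assuming a C++17 compiler; convertFile() is a hypothetical stand-in for the single-file converter already written:

Code:
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

// Hypothetical stand-in for the existing single-file converter.
void convertFile(const fs::path& p)
{
    std::cout << "would convert: " << p << '\n';
}

int main(int argc, char* argv[])
{
    fs::path root = (argc > 1) ? argv[1] : ".";

    // recursive_directory_iterator visits every entry under root,
    // including all sub-directories.
    for (const auto& entry : fs::recursive_directory_iterator(root))
        if (entry.is_regular_file())
            convertFile(entry.path());
}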
All I want to do is read a fixed char array of size 4 from the user, write it to a binary file, and then print the encrypted content from the file to the console screen. But it seems to print the same input every time, and I have tried everything. It works fine with integers and strings, but when it comes to char arrays, nothing works.
#include <iostream>
#include <fstream>
#include <cstring>
using namespace std;
I am trying to write out an array of unsigned int values in binary format, but I get the following compilation error:
In function ‘int CIndex(std::fstream&, std::fstream&, std::fstream&, std::fstream&)’:
./src/IndexBuilder/index.cpp:23:26: error: no matching function for call to ‘std::basic_fstream<char>::write(int*, long unsigned int)’
./src/IndexBuilder/index.cpp:23:26: note: candidate is:
/usr/include/c++/4.6/bits/ostream.tcc:184:5: note: std::basic_ostream<_CharT, _Traits>& std::basic_ostream<_CharT, _Traits>::write(const _CharT*, std::streamsize) [with _CharT = char, _Traits = std::char_traits<char>, std::streamsize = long int]
This is the part that is not working:
Code:
// uia is: unsigned int* uia;
// then I have allocated the space for it
// load it with unsigned ints
// k is the number of variables in my array
o.write(uia, sizeof(unsigned int) * k);

But this should be so simple and straightforward.... In C I do it as:
Code:
fwrite(uia, sizeof(unsigned int), k, fp);

but since I would need to convert the fstream to a FILE*, I decided to do it the C++ way.
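For what it's worth, the error above is just about the pointer type: per the candidate in the compiler message, ostream::write() takes a const char*, so the buffer has to be reinterpreted. A minimal sketch of the fix:

Code:
#include <cstddef>
#include <fstream>

int main()
{
    std::size_t k = 10;
    unsigned int* uia = new unsigned int[k];
    for (std::size_t i = 0; i < k; ++i)
        uia[i] = static_cast<unsigned int>(i);

    std::ofstream o("index.bin", std::ios::binary);

    // write() expects const char*, hence the cast; the byte count is unchanged.
    o.write(reinterpret_cast<const char*>(uia), sizeof(unsigned int) * k);

    delete[] uia;
}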
Double values are stored in a text file: 23.5 36.8 34.2 ... My teacher told me to read them character by character and then build words; that is, I have to read "2", "3", ".", "5", treat them together as one word, and then, using atoi(), convert it into a double. But I don't know how to do this....
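One way to follow that approach: collect characters into a small buffer until whitespace, then convert the completed word. Note that atoi() returns an int and would drop the fraction, so atof() (or strtod()) is the natural choice for doubles. A sketch, assuming the file is named values.txt:

Code:
#include <cctype>
#include <cstdlib>
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream in("values.txt"); // contains e.g. "23.5 36.8 34.2"
    char word[64];
    int n = 0;
    char c;

    while (in.get(c)) {
        if (std::isspace(static_cast<unsigned char>(c))) {
            if (n > 0) {                      // a complete word like "23.5"
                word[n] = '\0';
                std::cout << std::atof(word) << '\n';
                n = 0;
            }
        } else if (n < 63) {
            word[n++] = c;                    // build the word char by char
        }
    }
    if (n > 0) {                              // last word, if no trailing space
        word[n] = '\0';
        std::cout << std::atof(word) << '\n';
    }
}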
I have one requirement. I have a project, let's say baseProject.vcxproj. It has some header files. Another project, let's say dependentProject.vcxproj, loads baseProject's DLL and uses some of its header files.
When some other project, let's say unrelatedProject, includes a header file from dependentProject which in turn includes baseProject's header file, it forces a change to the include directory settings of unrelatedProject. How can I avoid this?
Following is the program I wrote. It basically takes 9 inputs and saves them into a binary file, then prints out the data stored in the binary file, finds the inverse of it, and prints the inverse out. But it's stuck in a loop somewhere.
Code:
#include <stdio.h>

int main()
{
    int a[3][3], i, j;
    float determinant = 0;
    int x;
    FILE *fp = fopen("file.bin", "wb");
How would one cycle through a turn order in a turn-based game? I was thinking of an array of every creature (including the player), with a pointer into the array incremented after each turn, but I couldn't put all the objects into an array.
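The usual trick for "couldn't put all the objects into an array" is polymorphism: store pointers to a common base class rather than the objects themselves. A sketch with hypothetical Creature/Player/Goblin names:

Code:
#include <iostream>
#include <memory>
#include <vector>

// Every combatant, player included, shares this interface.
struct Creature {
    virtual void takeTurn() = 0;
    virtual ~Creature() = default;
};

struct Player : Creature {
    void takeTurn() override { std::cout << "player acts\n"; }
};

struct Goblin : Creature {
    void takeTurn() override { std::cout << "goblin acts\n"; }
};

int main()
{
    std::vector<std::unique_ptr<Creature>> order;
    order.push_back(std::make_unique<Player>());
    order.push_back(std::make_unique<Goblin>());
    order.push_back(std::make_unique<Goblin>());

    std::size_t turn = 0;
    for (int i = 0; i < 6; ++i) {           // demo: run six turns
        order[turn]->takeTurn();
        turn = (turn + 1) % order.size();   // wrap back to the first creature
    }
}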
I'm facing a problem with data entry into a file. I'm building arrays that terminate when I press the Enter key, but the character at index 0 never makes it into the file while the rest of the indexes are there. In other words, while writing to the file, the first character of every array gets missed and isn't present in the file.
I'm trying to read a file that is in byte format and then append it onto another file. I'm doing this with unsigned char variable types because they're always one byte. Since the format simply uses bytes, it doesn't care about the character representation. However, when I read the characters in and then put them out again, the '\n' character is always preceded by a '\r' character. In hexadecimal this looks like 0D0A. I have no control over this, and it seems as if it's being done automatically by the ofstream put() function.
So, is there a way to suppress this added character and simply write the raw data to the file?
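Those 0D 0A pairs are the Windows text-mode newline translation, not something put() invents: any stream opened in text mode expands '\n' to "\r\n" on output. Opening both files in binary mode switches it off. A minimal sketch:

Code:
#include <fstream>

int main()
{
    // std::ios::binary disables the '\n' -> "\r\n" translation on Windows.
    std::ifstream in("input.dat", std::ios::binary);
    std::ofstream out("output.dat", std::ios::binary | std::ios::app);

    char c;
    while (in.get(c))
        out.put(c);   // bytes pass through untouched
}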
This program has to convert an unsigned binary number into a decimal number. No matter what binary number I enter, however, it always outputs that the decimal number is 0.
My code is as follows:
#include <iostream>
#include <cmath>
#include <algorithm>
#include <string> // needed for std::string
using namespace std;

int main()
{
    string binarynumber;
    cout << "Enter an unsigned binary number up to 32 bits." << endl;
[Code] ....
And my output:
Enter an unsigned binary number up to 32 bits.
00001111
That number in decimal is 0
The output should have shown the binary number in decimal to be 15, and I cannot find my error.
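For comparison, here's a minimal version that produces 15 for the input above; std::stoul with base 2 does the whole conversion:

Code:
#include <iostream>
#include <string>

int main()
{
    std::string binarynumber;
    std::cout << "Enter an unsigned binary number up to 32 bits." << std::endl;
    std::cin >> binarynumber;

    // Base-2 parse: "00001111" -> 15
    unsigned long value = std::stoul(binarynumber, nullptr, 2);
    std::cout << "That number in decimal is " << value << std::endl;
}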
Say the user inputs x names and then enters x values for each name. How would you store these values in a 2D array and be able to add up the values in each row, where each row represents one name?
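One way to set this up, assuming x names with x values each (a vector of rows beside a vector of names; adjust the inner count if the number of values differs):

Code:
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::size_t x;
    std::cout << "How many names? ";
    std::cin >> x;

    std::vector<std::string> names(x);
    std::vector<std::vector<double>> values(x, std::vector<double>(x));

    for (std::size_t i = 0; i < x; ++i) {
        std::cout << "Name " << i + 1 << " and " << x << " values: ";
        std::cin >> names[i];
        for (std::size_t j = 0; j < x; ++j)
            std::cin >> values[i][j];        // row i holds name i's values
    }

    for (std::size_t i = 0; i < x; ++i) {
        double sum = 0;
        for (double v : values[i])
            sum += v;                        // add up the row
        std::cout << names[i] << ": total " << sum << '\n';
    }
}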
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string integer1;
    string integer2;
    cout << " enter your first number: " << endl;
    cin >> integer1;
    cout << endl;
    cout << integer1 << " is your first number" << endl;
}
Now how do I turn the string integer into an array?
I know strings are essentially just arrays of characters, so what would be the easiest way to take each individual digit and put it into a separate space in an array?
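Since each character is a digit, subtracting '0' from it yields its numeric value. A short sketch:

Code:
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string integer1 = "4721";   // e.g. what the user typed

    std::vector<int> digits;
    for (char c : integer1)
        digits.push_back(c - '0');   // '4' - '0' == 4, and so on

    for (int d : digits)
        std::cout << d << ' ';       // prints: 4 7 2 1
    std::cout << '\n';
}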
I'm not the best at C, but I'm trying to write a C function that opens a text file containing assembler language, does a syntax error check on it, and then converts the binary data into hex.
This is my code so far:
Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *fname;
    char prompt;
    char filename[15];
    char text[100];
    printf("Please enter the name of the file you wish to open: ");
I want to write code to convert a string into binary data. I wrote code for this and it works perfectly, but there is one problem: some of the binary values come out as 7 bits, and I want to make them 8 bits by padding with a 0.
#include <iostream>
#include <fstream>
#include <string>
using namespace std;
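If the goal is a fixed 8-bit group per character, std::bitset<8> pads with leading zeros automatically, so the 7-bit case never arises. A sketch of that alternative:

Code:
#include <bitset>
#include <iostream>
#include <string>

int main()
{
    std::string text = "Hi";
    for (unsigned char c : text)
        // bitset<8> always yields 8 digits: 'H' (0x48) -> 01001000
        std::cout << std::bitset<8>(c).to_string() << ' ';
    std::cout << '\n';
}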
I am trying to encrypt a char Array with binary data.
I think I understand the basics of encryption/decryption, but I fail to see how to implement it.
I am trying to have a "key" that needs to be entered so the data can become a readable executable. The program I am encrypting is a small console program that prints the text "A secret message from your friend" (not that it matters).
I have the binary data, which I can copy and whatnot. But how do I go about encrypting it, and then decrypting it without destroying the data?
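The simplest reversible scheme is a repeating XOR against the key: running the exact same pass a second time restores every byte, so nothing is destroyed. This is not real security, but it shows the round trip. A sketch:

Code:
#include <cstddef>
#include <cstring>
#include <iostream>

// XOR each byte with the key, cycling through the key's characters.
// Applying the function twice with the same key is a no-op.
void xorCrypt(unsigned char* data, std::size_t len, const char* key)
{
    std::size_t klen = std::strlen(key);
    for (std::size_t i = 0; i < len; ++i)
        data[i] ^= static_cast<unsigned char>(key[i % klen]);
}

int main()
{
    unsigned char msg[] = "A secret message from your friend";
    std::size_t len = sizeof(msg) - 1;

    xorCrypt(msg, len, "key1");   // encrypt: bytes become unreadable
    xorCrypt(msg, len, "key1");   // decrypt: identical to the original

    std::cout << msg << '\n';
}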
I develop software using the Qt 5 open-source IDE. Now my question is two-fold:
1. How can I create the final executable file that I can upload for my users? I understand that runtime DLLs will be required, and I have tried the Enigma Virtual Box software for bundling runtime files. It does create a file that I can execute from any folder on my PC. However, surprisingly, when I transfer that "boxed" file to another PC, it does not run. Both PCs have Windows 7 installed on them.
2. Secondly, I foresee possible future issues with antivirus software. Apparently, when I try to run the boxed exe file, it gets rejected by the antivirus software on my own PC. Is there a way in which I can get my exe file verified/checked/registered with the antivirus vendors so that my users don't face any problems executing the program?
I cannot afford the Qt commercial licence, but I am prepared to buy an economical "setup file generating" tool (if one exists).
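One note on point 1: Qt 5 on Windows ships its own deployment tool, windeployqt, which copies the runtime DLLs an executable needs into the executable's folder; that folder can then be zipped or handed to any installer builder. Assuming the release build sits at release\myapp.exe (path is just an example), the usual invocation from a Qt command prompt is windeployqt release\myapp.exe.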
I am writing a program which compresses files into .zip files.
Here's my problem: whenever I want to compress an executable file, my readFile function does not read the entire file. When I extract the .exe, I get a very tiny and incomplete file.
Here's the function I use to read files:
std::string miniz_wrapper::readFile(FILE* f, int MAX_FILEBUFFER) // MAX_FILEBUFFER has a default value of 65536
{
    char* tmp;
    std::string tmp_s;
    int count = 0;
[Code] .....
Prior to reading, every file is opened using fopen with the mode "rb".
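A common cause of exactly this symptom is treating the bytes as a C string somewhere: strlen(), strcpy(), or std::string's plain const char* constructor all stop at the first 0x00 byte, and executables are full of them. A sketch that reads by size and keeps embedded zeros (illustrative names, not the original readFile):

Code:
#include <cstdio>
#include <string>
#include <vector>

// Read the whole file, preserving embedded '\0' bytes.
std::string readWholeFile(FILE* f)
{
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);            // total length in bytes
    std::fseek(f, 0, SEEK_SET);

    std::vector<char> buf(size > 0 ? size : 0);
    std::size_t got = std::fread(buf.data(), 1, buf.size(), f);

    // Explicit length: nothing stops at a zero byte here.
    return std::string(buf.data(), got);
}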
I'm writing scientific software where I'd like to send a 2D array (5x4) over a named pipe from a server to a client. When I send a static array (i.e., double res[5][4];), everything works perfectly, but when I allocate a dynamic array, it produces nonsense numbers on the client side. I suspect this is because I'm pointing to memory that cannot be shared through a pipe. Am I right, and how can I pass the dynamically allocated array itself over the pipe?
//Server program
// Create a pipe to send/receive data
HANDLE pipe = CreateNamedPipe(
    "\\\\.\\pipe\\my_pipe",  // name of the pipe
    PIPE_ACCESS_DUPLEX,      // 2-way pipe -- send and read
    PIPE_TYPE_BYTE,          // send data as a byte stream
    1,                       // only allow 1 instance of this pipe
    0,                       // no outbound buffer
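The hunch is essentially right. If the dynamic array is a double** with each row allocated separately, its bytes are not contiguous, so a raw write sends pointer values instead of numbers. Allocating the 5x4 matrix as one block fixes that. A sketch of the send side, continuing from the handle above (assumes <windows.h> and an error-checked pipe):

Code:
// One contiguous block: res[r * COLS + c] stands in for res[r][c].
const int ROWS = 5, COLS = 4;
double* res = new double[ROWS * COLS];

for (int r = 0; r < ROWS; ++r)
    for (int c = 0; c < COLS; ++c)
        res[r * COLS + c] = r + c * 0.1;   // fill with sample data

DWORD written = 0;
WriteFile(pipe,                            // handle created above
          res,                             // contiguous data
          ROWS * COLS * sizeof(double),    // exact byte count
          &written,
          NULL);

delete[] res;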
I am having problems either writing data to a binary file or reading from the file. Through the process of elimination I am posting the code where the data is written to file to see if I can eliminate that as an option. I know the data is being processed correctly because, through the use of another function, I can view the data.
I also know that fwrite must be including some padding because the file size ends up being 576 bytes after it is written instead of 540 bytes (the size it would be if no padding is used). Here is my struct:
Code:
typedef struct {
    char teams[25];
    float wins;
    float losses;
    float pct;
    int runsScored;
    int runsAgainst;
} STATISTICS;
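The extra 36 bytes are alignment padding inside the struct, not something fwrite adds: the compiler inserts 3 bytes after teams[25] so the floats start on a 4-byte boundary, making sizeof(STATISTICS) 48 instead of 45, and 12 records x 48 = 576. Writing the members one at a time yields the unpadded 45-byte records; a sketch:

Code:
#include <stdio.h>

/* Writes one record field by field so no padding reaches the file. */
void writeRecord(FILE *fp, const STATISTICS *s)
{
    fwrite(s->teams,        sizeof s->teams,       1, fp); /* 25 bytes */
    fwrite(&s->wins,        sizeof s->wins,        1, fp); /*  4 bytes */
    fwrite(&s->losses,      sizeof s->losses,      1, fp);
    fwrite(&s->pct,         sizeof s->pct,         1, fp);
    fwrite(&s->runsScored,  sizeof s->runsScored,  1, fp);
    fwrite(&s->runsAgainst, sizeof s->runsAgainst, 1, fp); /* 45 total */
}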
I found the following code in [URL] ..... that sends .TXT files perfectly to a PHP script on my server using WinInet, but when I send a .BMP file, the file is correctly created and named on the server side, but it is empty! I read that it is necessary to implement base64 encoding for this to work properly, so how would I do that?
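A minimal base64 encoder (standard RFC 4648 alphabet) is short enough to inline; how the encoded string gets spliced into the WinInet request depends on the code at the link, so take this as a sketch of the encoding step only:

Code:
#include <cstddef>
#include <string>

std::string base64Encode(const unsigned char* data, std::size_t len)
{
    static const char tbl[] =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    std::string out;
    out.reserve(((len + 2) / 3) * 4);

    for (std::size_t i = 0; i < len; i += 3) {
        unsigned v = data[i] << 16;                 // pack up to 3 bytes
        if (i + 1 < len) v |= data[i + 1] << 8;
        if (i + 2 < len) v |= data[i + 2];

        out += tbl[(v >> 18) & 0x3F];               // emit 4 sextets,
        out += tbl[(v >> 12) & 0x3F];               // '=' where input ran out
        out += (i + 1 < len) ? tbl[(v >> 6) & 0x3F] : '=';
        out += (i + 2 < len) ? tbl[v & 0x3F] : '=';
    }
    return out;
}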
I'm making a binary file that has 100 "empty spaces", and then I want to overwrite a specific place with info. However, it always writes the info at the end of the file, no matter what I try, even though getting the position before I call write() reports the correct position...
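One frequent culprit: a stream opened with std::ios::app seeks to the end before every single write, no matter what seekp() reported just beforehand. Opening the existing file with in | out | binary instead keeps seekp() effective. A sketch:

Code:
#include <fstream>

int main()
{
    const int SLOT_SIZE = 10;

    // NOTE: std::ios::app would force every write to the end of the file.
    // in | out opens the existing file without truncating it, so seekp()
    // can position the write anywhere.
    std::fstream f("data.bin",
                   std::ios::in | std::ios::out | std::ios::binary);

    char info[SLOT_SIZE] = "record 7";
    f.seekp(7 * SLOT_SIZE);       // jump to slot 7 of the 100 slots
    f.write(info, SLOT_SIZE);     // overwrites in place, not at the end
}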