I am trying to convert some chars to UTF-8 strings...
Example:
std::string gethex(char c) { /* EXAMPLE: if (c == 'é') return "%c3%a9"; */ }
I need a function that converts chars like "á, é, í, ã" to UTF-8 hexadecimal strings...
[Code] .....
[URL] .... does it. Choose UTF-8, type some character and click 'Url Encode'.
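For reference, here is a minimal C sketch of the idea (the post asks for std::string, so treat this purely as an illustration, with made-up names): if the input bytes are already UTF-8, URL-encoding is just printing every byte outside the unreserved ASCII set as %xx.
Code:
#include <stdio.h>
#include <string.h>

/* Percent-encode a UTF-8 byte string: every byte that is not an
   unreserved ASCII character becomes "%xx" (lower-case hex).
   `out` must have room for 3 * strlen(s) + 1 bytes. */
void urlencode_utf8(const char *s, char *out)
{
    static const char *keep =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_.~";
    while (*s) {
        unsigned char c = (unsigned char)*s++;
        if (strchr(keep, c))
            *out++ = (char)c;
        else
            out += sprintf(out, "%%%02x", c);
    }
    *out = '\0';
}

int main(void)
{
    char buf[64];
    urlencode_utf8("\xc3\xa9", buf);   /* the UTF-8 bytes of 'é' */
    printf("%s\n", buf);               /* prints %c3%a9 */
    return 0;
}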
As a part of a program I am supposed to write, I would like to receive a string from the user (for example: "Hi my name is Joe").
Obviously, the string is stored in an array of chars (arr[0]='H', arr[1]='i', arr[2]=' ', ... and so on).
What I would like to do is put each word in its own array cell (for example arr[0]="Hi", arr[1]="my", and so on). How can I do this? (I cannot use any library functions unless I write them myself.)
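A minimal sketch of one way to do it by hand (MAX_WORDS, MAX_LEN and the function name are made up): walk the characters, copy non-space characters into the current word, and close the word whenever a space or the end of the string is reached.
Code:
#include <stdio.h>

#define MAX_WORDS 50
#define MAX_LEN   32

/* Split `line` on spaces into words[0..count-1]; returns the word count. */
int split_words(const char *line, char words[MAX_WORDS][MAX_LEN])
{
    int count = 0, pos = 0;

    for (int i = 0; ; i++) {
        char c = line[i];
        if (c == ' ' || c == '\0') {
            if (pos > 0) {                  /* close the current word */
                words[count][pos] = '\0';
                pos = 0;
                if (++count == MAX_WORDS)
                    break;
            }
            if (c == '\0')
                break;
        } else if (pos < MAX_LEN - 1) {
            words[count][pos++] = c;        /* append to the current word */
        }
    }
    return count;
}

int main(void)
{
    char words[MAX_WORDS][MAX_LEN];
    int n = split_words("Hi my name is Joe", words);

    for (int i = 0; i < n; i++)
        printf("words[%d] = \"%s\"\n", i, words[i]);
    return 0;
}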
I am trying to create a program that reads my file filled with random words, puts the words into a 2D array, and then compares them to see if there are any matching words. Unfortunately the count is not working for me (in function2 and function3) and I am not sure why.
Code:
#include <stdio.h>
#include <string.h>

char function1(char words_array[][17]);
int function2(char words_array[][17]);
void function3(int pairs, char words_array[][17]);

int main(void)
{
    char words_array[20][17];
    int x = 0;
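Without the rest of the code it is hard to say where the count goes wrong, but two common slips are resetting the counter inside the loops and comparing a word with itself. A sketch of a pair-counting function over an array shaped like words_array (the function name and the i < j loop structure are my own):
Code:
#include <string.h>

/* Count how many pairs of identical words appear in the array.
   Each unordered pair (i, j) with i < j is counted once. */
int count_pairs(char words_array[][17], int n)
{
    int pairs = 0;                       /* initialise once, outside the loops */

    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)  /* j starts after i: no self-compare */
            if (strcmp(words_array[i], words_array[j]) == 0)
                pairs++;
    return pairs;
}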
From the K&R book (4.2 - Functions Returning Non-integers); it doesn't compile - 1 error.
Code:
#include <stdio.h>
#define MAXLINE 100

/* rudimentary calculator */
main()
{
    double sum, atof(char []);
    char line[MAXLINE];
    int getline(char line[], int max);

    sum = 0;
    while (getline(line, MAXLINE) > 0)
        printf(" %g ", sum += atof(line));
    return 0;
}
Code:
int getline(char line[], int max);
is no different from:
Code:
getline(char line[], int max);
?
So the int (return type) is optional when declaring a function? I assume the return value gets cast to double? And why the (char [])? It just looks weird because I've never seen it before.
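For what it's worth, the line "double sum, atof(char []);" is not a cast: it declares the variable sum and, in the same statement, declares atof as a function taking a char array and returning double, exactly like the int getline declaration below it. Dropping the int relies on the old "implicit int" rule, which C99 removed, so modern compilers reject the bare form; the single error people usually hit with this exact example, though, is that their C library already declares a different getline in <stdio.h>. A hedged sketch that compiles today (renaming getline is an assumption about where your one error comes from):
Code:
#include <stdio.h>
#include <stdlib.h>   /* declares atof, so no in-function declaration needed */

#define MAXLINE 100

int get_line(char line[], int max);   /* renamed to avoid the library's getline */

/* rudimentary calculator */
int main(void)
{
    double sum = 0;
    char line[MAXLINE];

    while (get_line(line, MAXLINE) > 0)
        printf("\t%g\n", sum += atof(line));
    return 0;
}

int get_line(char line[], int max)
{
    int c = 0, i = 0;

    while (i < max - 1 && (c = getchar()) != EOF && c != '\n')
        line[i++] = (char)c;
    if (c == '\n')
        line[i++] = (char)c;
    line[i] = '\0';
    return i;
}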
I've been working on a program that acts as a form of Roman numeral calculator: I input Roman numeral characters, and the program is supposed to output the corresponding digits. *Not allowed to use for loops or arrays.
Input: MCCXXVI LXVIIII +
Output:
The first number is 1226
The second number is 69
Arithmetic operation is +
The sum of 1226 and 69 is MCCLXXXXV (1295)
However, when I run the program:
input: MCCXXVI LXVIIII +
Output:
The first number is 77
The second number is 76
Arithmetic operation is +
The sum of 77 and 76 is (infinite loop of I's) (153)
I noticed that when I input MCCXXVI, it only takes the first character (I thought cin.get() was supposed to stop this?) and returns the ASCII decimal value of that character instead of the integer value that I assigned to each letter. Why do I get an infinite loop, and how do I fix it?
#include <iostream>
#include <iomanip>
#include <string>
#include <cmath>

using namespace std;

const int I_Value = 1;
const int V_Value = 5;
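The post is C++, but the shape of a fix is language-neutral: read one character at a time until whitespace, map each letter to its value with a switch (no arrays needed), and handle subtractive pairs by comparing with the previous letter's value. A hedged C sketch using getchar and only while loops:
Code:
#include <stdio.h>
#include <ctype.h>

/* Map one Roman numeral letter to its value; 0 for anything else. */
static int letter_value(int c)
{
    switch (toupper(c)) {
    case 'I': return 1;
    case 'V': return 5;
    case 'X': return 10;
    case 'L': return 50;
    case 'C': return 100;
    case 'D': return 500;
    case 'M': return 1000;
    default:  return 0;
    }
}

/* Read one whitespace-delimited Roman numeral from stdin and
   return its decimal value. */
int read_roman(void)
{
    int total = 0, prev = 0, c;

    while ((c = getchar()) != EOF && isspace(c))
        ;                                   /* skip leading whitespace */
    while (c != EOF && !isspace(c)) {
        int v = letter_value(c);
        total += v;
        if (prev < v)
            total -= 2 * prev;              /* subtractive pair like IV, IX */
        prev = v;
        c = getchar();
    }
    return total;
}

int main(void)
{
    printf("The first number is %d\n", read_roman());   /* MCCXXVI -> 1226 */
    printf("The second number is %d\n", read_roman());  /* LXVIIII -> 69 */
    return 0;
}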
You have to expect the following input: an arbitrary number of lines, each consisting of 5 int32 numbers. The input is terminated by EOF.
E.g.:
1 2 3 4 5
6 7 8 9 0
...
You're then supposed to convert the numbers to integers and do some calculations. I would know how to parse a single line of 5 numbers via scanf(); that's easy, and that's exactly what they did in class.
But how do I go about splitting the lines? What about the EOF? Even if I could hack something together using errno or something, it would be way beyond what they are doing at the moment. The input is received via user input, i.e. stdin.
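scanf actually answers both questions: it treats newlines as ordinary whitespace, so the line breaks need no special handling, and it returns the number of items it converted (or EOF), which is the usual end-of-input test. A minimal sketch:
Code:
#include <stdio.h>

int main(void)
{
    int a, b, c, d, e;

    /* scanf skips newlines like any other whitespace, so reading
       "5 ints per line" is just reading 5 ints repeatedly.  It returns 5
       while a full set was converted, and EOF at the end of input. */
    while (scanf("%d %d %d %d %d", &a, &b, &c, &d, &e) == 5) {
        printf("sum of this line: %d\n", a + b + c + d + e);
    }
    return 0;
}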
I have a problem set where I have to read in numbers from a file as strings, convert the strings to integers, and put the integers into a linked list, where each integer is a node. This is what I have so far:
Code:
#include <stdio.h>
#include <stdlib.h>

#define MAX_INT_SIZE 10000

typedef struct integer BigInt;

struct integer {
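The posted struct is cut off, so the sketch below invents its own node layout and assumes each number fits in a long (with MAX_INT_SIZE 10000 you may really be storing digit arrays, in which case the conversion step changes): read each token as a string, convert it, and push a node onto the front of the list. The file name is also made up.
Code:
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical node layout: the posted struct integer is truncated,
   so this sketch just stores a long per node. */
typedef struct node {
    long value;
    struct node *next;
} Node;

int main(void)
{
    FILE *fp = fopen("numbers.txt", "r");   /* file name is an assumption */
    char token[32];
    Node *head = NULL;

    if (fp == NULL) {
        perror("fopen");
        return 1;
    }
    while (fscanf(fp, "%31s", token) == 1) {
        Node *n = malloc(sizeof *n);
        if (n == NULL)
            break;
        n->value = strtol(token, NULL, 10); /* string -> integer */
        n->next = head;                     /* prepend to the list */
        head = n;
    }
    fclose(fp);

    for (Node *p = head; p != NULL; p = p->next)
        printf("%ld\n", p->value);
    return 0;
}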
I am working on a number of utility functions for two dimensional arrays of integers, or matrices. However I am having a problem with segmentation faults, most likely due to errors in using malloc in functions like copyMatrix.
Code:
matrix_utils.h
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// This function checks if two matrices are equal
int isEqual(int **A, int **B, int n);

// This function returns one row of a matrix
int *getRow(int **A, int i, int n);

// This function returns one column of a matrix
int *getCol(int **A, int j, int n);
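Segfaults in code like this usually come from allocating only the array of row pointers, or only sizeof(int) per row, before writing through it. A sketch of a copyMatrix that allocates both levels, assuming n-by-n matrices as the prototypes suggest:
Code:
#include <stdlib.h>

/* Allocate an n-by-n copy of A: first the array of n row pointers,
   then n ints for each row.  Returns NULL on allocation failure. */
int **copyMatrix(int **A, int n)
{
    int **B = malloc(n * sizeof *B);        /* n row pointers */
    if (B == NULL)
        return NULL;

    for (int i = 0; i < n; i++) {
        B[i] = malloc(n * sizeof *B[i]);    /* n ints in this row */
        if (B[i] == NULL) {
            while (i-- > 0)                 /* clean up rows already done */
                free(B[i]);
            free(B);
            return NULL;
        }
        for (int j = 0; j < n; j++)
            B[i][j] = A[i][j];
    }
    return B;
}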
The code is supposed to convert characters from an array into their respective ASCII integers, padding with a 0 if the number is less than 3 digits long. It is then supposed to put it all together into one string.
Code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main(void)
{
    char file[] = "This is a test";
    char *ptr = file;
    int length = strlen(file);
    int i, numbers[length];
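A sketch of that step, assuming the 0 is padded on the left: sprintf with %03d pads each value to three digits, and writing at an advancing offset glues the pieces into one string (the result buffer is my addition).
Code:
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int main(void)
{
    char file[] = "This is a test";
    int length = (int)strlen(file);

    /* 3 digits per character plus the terminating '\0' */
    char *result = malloc(3 * length + 1);
    if (result == NULL)
        return 1;

    for (int i = 0; i < length; i++)
        sprintf(result + 3 * i, "%03d", (unsigned char)file[i]);

    printf("%s\n", result);   /* "084104105115..." for "This..." */
    free(result);
    return 0;
}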
I know this is a pretty basic program; the problem is that I have to use a function for each calculation. When the user types in the number of quarters, dimes, nickels, and pennies, the function that is supposed to add them together displays a huge number instead of the right one.
Here is my code:
#include <stdio.h>

void insert_money(int p, int n, int d, int q, int total)
{
    printf("How much of each coin do you want to insert: ");
    printf("Quarters: ");
    scanf("%d", &q);
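The huge number is the classic symptom of passing by value: insert_money writes into its own copies of q, d, n and p, so whatever adds them up later still sees the caller's uninitialized variables. A sketch of one fix, returning the total instead (pointer parameters would work just as well):
Code:
#include <stdio.h>

/* Ask for each coin count and return the total value in cents. */
int insert_money(void)
{
    int q = 0, d = 0, n = 0, p = 0;

    printf("How much of each coin do you want to insert?\n");
    printf("Quarters: ");
    scanf("%d", &q);
    printf("Dimes: ");
    scanf("%d", &d);
    printf("Nickels: ");
    scanf("%d", &n);
    printf("Pennies: ");
    scanf("%d", &p);

    return 25 * q + 10 * d + 5 * n + p;   /* total in cents */
}

int main(void)
{
    int total = insert_money();
    printf("Total inserted: %d cents\n", total);
    return 0;
}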
Working on a Win32 console app (VS 2010), I have been trying to convert several Unicode (UTF-16) C++ functions to ANSI C (UTF-8). The test app includes two tokenizer classes, CTokA and CTokW (UTF-8 and UTF-16), each of which works perfectly well in its respective environment.
A problem arises when I attempt to run the UTF-8 functions while the Character Set property is set to 'Use Unicode Character Set': std::string manipulations do not perform as expected, e.g.,
printf("start ");
gets reproduced as
printf("start ");══════════ ²²²²
Attempting to null terminate the string where it is supposed to end simply results in a space in that position and the garbage end persists, e.g.,
printf("sta t ");══════════ ²²²²
Code: sline[11] = 0x0000;
If I attempt to change the Character Set property to 'Use Multibyte Character Set' or 'Not Set', the app will not compile and hundreds of errors occur. Of course, I can eliminate all of the UTF-16 code, but it strikes me that it should not be necessary. Perhaps if M$ made everything UTF-16 without all of the necessary decorations like 'L' and '_T(', life would be much simpler. Unfortunately, I have a very extensive UTF-8 app under 10 years of development that works quite well, but my UTF-16 (Unicode) conversion doesn't work as well because of the mixing of pointers (I think), so I have had to revert much of the code back to UTF-8. (All of which has nothing to do with my question but is simply psychotherapeutic for me to ventilate on.)
My question is this: Can UTF-8 and UTF-16 code coexist in a single Win32 console app?
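To the question itself: yes, narrow (UTF-8) and wide (UTF-16) strings can coexist in one Win32 program; the 'Character Set' project setting only controls whether TCHAR and the A/W macros expand to the narrow or wide variants. The ═/² garbage looks like the CP437 rendering of the fill bytes MSVC's debug heap writes, i.e. the buffer is being read past where the code thinks it ends. The usual bridge between the two worlds is MultiByteToWideChar / WideCharToMultiByte with CP_UTF8; a hedged sketch of one direction:
Code:
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const wchar_t *wide = L"start";
    char narrow[64];

    /* Convert UTF-16 -> UTF-8.  With CP_UTF8 the last two arguments
       must be NULL; -1 means "convert up to and including the terminator". */
    int n = WideCharToMultiByte(CP_UTF8, 0, wide, -1,
                                narrow, (int)sizeof narrow, NULL, NULL);
    if (n > 0)
        printf("%s\n", narrow);
    return 0;
}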
I was just wondering how to deal with overflow. My code for exponentiation works when I put in small numbers like 2 and 3, so 2^3 gives 8. But if I try something like 2^44, I just get 0.
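If the power is accumulated in a 32-bit int, anything past 2^31 - 1 wraps around, and 2^44 is a multiple of 2^32, so it wraps to exactly 0. A sketch that keeps the result in an unsigned long long (64 bits, plenty for 2^44) and reports when even that would overflow (ipow is a made-up name):
Code:
#include <stdio.h>
#include <limits.h>

/* Compute base^exp in 64 bits; sets *overflow if the result
   would not fit in an unsigned long long. */
unsigned long long ipow(unsigned long long base, unsigned exp, int *overflow)
{
    unsigned long long result = 1;

    *overflow = 0;
    while (exp-- > 0) {
        if (base != 0 && result > ULLONG_MAX / base) {
            *overflow = 1;              /* the next multiply would wrap */
            return 0;
        }
        result *= base;
    }
    return result;
}

int main(void)
{
    int of;
    unsigned long long r = ipow(2, 44, &of);

    if (of)
        printf("overflow\n");
    else
        printf("2^44 = %llu\n", r);     /* prints 17592186044416 */
    return 0;
}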
I have some header files with generic functions, and in one of them I define TRUE and FALSE. Recently I started using another header file that defines TRUE and FALSE as well. That caused GCC to throw an error and not compile. In my case I can put #ifndef around the definitions so that they don't get defined again if they already exist, but is there a better way to handle duplicate names? I assume the situation is the same for duplicated functions; how does that get handled?
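The usual pattern is two kinds of guard: a file-level include guard so the header can be included twice safely, and a per-macro #ifndef so two headers defining TRUE/FALSE do not collide. Duplicate functions are different: a function defined in two translation units is a linker error, which is why headers should carry declarations and the single definition should live in one .c file. A sketch with a made-up header name:
Code:
/* mydefs.h -- hypothetical header name */
#ifndef MYDEFS_H          /* file-level include guard */
#define MYDEFS_H

#ifndef TRUE              /* only define if nobody else already did */
#define TRUE  1
#endif

#ifndef FALSE
#define FALSE 0
#endif

int clamp(int x, int lo, int hi);   /* declaration only; define it in one .c file */

#endif /* MYDEFS_H */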
I've knocked up a rough C parser for the purpose of colourizing code into XHTML/CSS, which makes me feel fancy. However, it doesn't quite handle comments properly. I can't quite work out how to deal with the slashes. Plus I'm sure there are other places that it slips up that don't feature in my simple tests, so here you go:-
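For the comment handling specifically, the usual trick is a small state machine: on '/', look at the next character; a second '/' starts a line comment that runs to the newline, '*' starts a block comment that runs to the next */ pair, and anything else means the slash was just an operator. A stripped-down C sketch of only that piece (string and character literals are ignored and HTML escaping is omitted, so this is not the whole colourizer):
Code:
#include <stdio.h>

/* Echo stdin to stdout, wrapping comments in <span class="comment"> ... </span>. */
int main(void)
{
    int c, d;

    while ((c = getchar()) != EOF) {
        if (c != '/') {
            putchar(c);
            continue;
        }
        d = getchar();
        if (d == '/') {                       /* line comment: // ... to newline */
            fputs("<span class=\"comment\">//", stdout);
            while ((c = getchar()) != EOF && c != '\n')
                putchar(c);
            fputs("</span>", stdout);
            if (c == '\n')
                putchar('\n');
        } else if (d == '*') {                /* block comment: ends at * then / */
            int prev = 0;
            fputs("<span class=\"comment\">/*", stdout);
            while ((c = getchar()) != EOF) {
                putchar(c);
                if (prev == '*' && c == '/')
                    break;
                prev = c;
            }
            fputs("</span>", stdout);
        } else {                              /* lone slash, e.g. division */
            putchar('/');
            if (d != EOF)
                putchar(d);
        }
    }
    return 0;
}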
My assignment is to create a program that prints a deck of 52 cards, shuffles it, deals 2 hands of 5 cards, and then prints the deck after dealing the 2 hands, with the dealt cards removed. This is my code so far...
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define DECK_SIZE 52

void init_deck(int deck[], int size);
void shuffle_deck(int deck[], int size);
[Code] ....
I am having trouble dealing the 2 random hands and printing the deck with those cards removed.
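One simple way to "remove" dealt cards without a second array: deal from the top of the shuffled deck by advancing an index, and print the remainder starting at that index. A sketch under that assumption, with a made-up card encoding (rank * 4 + suit) and the init/shuffle steps inlined:
Code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define DECK_SIZE 52
#define HAND_SIZE 5

/* Deal `count` cards off the top of the deck, starting at *top. */
void deal_hand(const int deck[], int *top, int hand[], int count)
{
    for (int i = 0; i < count; i++)
        hand[i] = deck[(*top)++];     /* advancing *top "removes" the card */
}

void print_cards(const int cards[], int count)
{
    /* Encoding assumption: card = rank * 4 + suit. */
    const char *suits = "CDHS";
    const char *ranks = "23456789TJQKA";

    for (int i = 0; i < count; i++)
        printf("%c%c ", ranks[cards[i] / 4], suits[cards[i] % 4]);
    printf("\n");
}

int main(void)
{
    int deck[DECK_SIZE], hand1[HAND_SIZE], hand2[HAND_SIZE], top = 0;

    for (int i = 0; i < DECK_SIZE; i++)           /* init */
        deck[i] = i;
    srand((unsigned)time(NULL));
    for (int i = DECK_SIZE - 1; i > 0; i--) {     /* Fisher-Yates shuffle */
        int j = rand() % (i + 1);
        int t = deck[i]; deck[i] = deck[j]; deck[j] = t;
    }

    deal_hand(deck, &top, hand1, HAND_SIZE);
    deal_hand(deck, &top, hand2, HAND_SIZE);

    printf("Hand 1: ");  print_cards(hand1, HAND_SIZE);
    printf("Hand 2: ");  print_cards(hand2, HAND_SIZE);
    printf("Remaining deck:\n");
    print_cards(deck + top, DECK_SIZE - top);     /* dealt cards excluded */
    return 0;
}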
I'm a beginner at C++ and I need to write a program that reads a set of integers and then finds and prints the sum of the even integers and the sum of the odd integers. The program cannot tell the user how many integers to enter, and I need separate totals for the even and odd numbers. What would I need to use so that I can read however many values the user inputs and get both sums?
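Since the count of integers is unknown, the standard pattern is to loop on the success of the read itself until input ends (Ctrl+D / Ctrl+Z or a non-number). A C-style sketch with scanf; the same while shape works with cin >> n in C++, since the stream also tests false at end of input.
Code:
#include <stdio.h>

int main(void)
{
    int n, even_sum = 0, odd_sum = 0;

    /* scanf returns 1 while it converts an integer; EOF or a
       non-number ends the loop, so no count is needed up front. */
    while (scanf("%d", &n) == 1) {
        if (n % 2 == 0)
            even_sum += n;
        else
            odd_sum += n;
    }
    printf("Sum of even numbers: %d\n", even_sum);
    printf("Sum of odd numbers:  %d\n", odd_sum);
    return 0;
}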
I'm looking for an algorithm to find the portions of a string that contain the same character. The only possible values are: a, n and g.
Char index:  0 1 2 3 4 5 6 7 8 9    RESULT
------------------------------------------
Example 1:   a g g g a a            0,4
Example 2:   g g g a n n a          0,3
Example 3:   a g g g g g g a        0,7
Example 4:   g g g g g g g          0,6
Example 5:   g g a a g a a a g      0,2,3,5,7,8
I'm working on a homework project, and it requires me to read in a file of chars into an array, and then do stuff with that array.
Anyways, I have the first part written, where I'm just trying to read in my data.txt file, and I thought I had it written well. It compiles, but then it seg faults, and I'm not sure why. I used calloc for the array, but maybe I misused it? Or is it in my EOF statement? I'm still not sure if that's coded correctly. I need to get past this so I can start testing the other parts of my code.
Code:
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    /* Local declarations */
    int i;
    int *ptr;
    char tempc;
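Without the rest of the code I can only point at the usual suspects: passing calloc a count of pointers instead of chars, using the FILE pointer without checking that fopen succeeded, or storing getc's result in a char so the EOF test never fires. A sketch of just the read-in part under those assumptions (data.txt is taken from your description):
Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *fp = fopen("data.txt", "r");
    if (fp == NULL) {                 /* a NULL fp dereferenced later = segfault */
        perror("data.txt");
        return 1;
    }

    size_t cap = 1024, len = 0;
    char *buf = calloc(cap, sizeof *buf);   /* cap chars, not sizeof(char *) */
    if (buf == NULL) {
        fclose(fp);
        return 1;
    }

    int c;                            /* int, so EOF (-1) is representable */
    while ((c = fgetc(fp)) != EOF) {
        if (len + 1 >= cap) {         /* grow the array when it fills up */
            cap *= 2;
            char *tmp = realloc(buf, cap);
            if (tmp == NULL)
                break;
            buf = tmp;
        }
        buf[len++] = (char)c;
    }
    buf[len] = '\0';
    fclose(fp);

    printf("read %zu characters\n", len);
    free(buf);
    return 0;
}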
I'm writing a school assignment that writes/reads user input into and out of a binary file.
I've gotten the write part to work, but now I need to be able to read that file back in and display it as a string.
I think I should be using fread() and reading my file into an array of ints, right? But when I try printing out that array, I get a whole bunch of numbers that don't match the hex codes in my file.
How do I read in a binary file and print it out as a string?
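A sketch of the read-back side: fread fills bytes, so read into an array of unsigned char rather than int (printing an int array lumps 4 bytes into each number, which is likely why the values did not match the hex dump), then print each byte with %02x, or as text if the file really holds characters. The file name and buffer size are placeholders.
Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *fp = fopen("data.bin", "rb");   /* hypothetical name; "rb" for binary */
    if (fp == NULL) {
        perror("data.bin");
        return 1;
    }

    unsigned char buf[256];
    size_t n = fread(buf, 1, sizeof buf, fp);   /* n = bytes actually read */
    fclose(fp);

    /* Print the same bytes two ways. */
    for (size_t i = 0; i < n; i++)
        printf("%02x ", buf[i]);              /* hex, should match a hex editor */
    printf("\n");

    printf("%.*s\n", (int)n, (const char *)buf);   /* as text, length-limited */
    return 0;
}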