I have the following code to create a histogram, but it gives the wrong output. In the program, input_vector reads 100 double numbers. I want to create a histogram with bin size = 5, but the output is [0;0;0;0;0].
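Without the original program it is hard to pinpoint the bug, but an all-zero result usually comes from integer division when computing the bin width or bin index. A minimal sketch, assuming "bin size = 5" means five equal-width bins (makeHistogram and its signature are illustrative, not the original code):

Code:

#include <algorithm>
#include <vector>

// Bins input_vector into 5 equal-width bins spanning the data's min..max.
std::vector<int> makeHistogram(const std::vector<double> &input_vector)
{
    std::vector<int> bins(5, 0);
    double lo = *std::min_element(input_vector.begin(), input_vector.end());
    double hi = *std::max_element(input_vector.begin(), input_vector.end());
    double width = (hi - lo) / 5.0;              // keep this in floating point

    for (double v : input_vector) {
        int idx = (width > 0.0) ? (int)((v - lo) / width) : 0;
        if (idx > 4) idx = 4;                    // the maximum value lands in bin 4
        bins[idx]++;
    }
    return bins;
}

If the width is computed with integer operands (e.g. (hi - lo) / 5 on ints), it can truncate to 0 and break the index calculation, which is a common way to end up with empty bins.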
Comparing with the screen size, the height is bigger but the length is smaller. I don't understand why.
I can understand that different printers process fonts in different ways and therefore produce different lengths. That's not the problem. The problem is that I need to simulate on screen the same behaviour I will have on the printer, because these texts are being aligned in the document, and I don't want the text to be aligned differently on screen than on paper.
What can I do to render the text on screen at the same size it will have on the printer? Print preview does it. Should I change the font parameters? Is it something related to pixels per inch?
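Print preview gets this effect by creating the screen font from the same point size as the printer font, converted through each device's pixels per inch. A minimal sketch of that conversion (CreateFontForDC is an illustrative helper, not an existing API):

Code:

#include <windows.h>

// Creates a font of the same point size for any device context, so text
// drawn on screen tracks the size it will have on the printer.
HFONT CreateFontForDC(HDC hdc, int pointSize, const wchar_t *faceName)
{
    LOGFONTW lf = {0};
    // Convert points to device pixels using the DC's vertical DPI; the
    // negative height asks for character height rather than cell height.
    lf.lfHeight = -MulDiv(pointSize, GetDeviceCaps(hdc, LOGPIXELSY), 72);
    lstrcpynW(lf.lfFaceName, faceName, LF_FACESIZE);
    return CreateFontIndirectW(&lf);
}

Even then, widths can differ slightly per device; for exact alignment the usual WYSIWYG approach is to measure text against the printer DC and scale the extents to the screen by the ratio of the two devices' LOGPIXELSX values.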
I'm working on some code and am trying to limit the size of the MySQL database string it's pulling.
The string can be at most 119 characters. If it's longer, I would like to do nothing, but if it meets the requirement, the script runs.
Code:

int32 message_id;
string_t message = "Is requesting some one to respond.";  // TEMP: should be the poster's message
string_t username = "Guest";                              // TEMP: should be the poster's name
// char will not be logged in, so get the id manually
[Code] ....
So here is where I'm having the problem:
if (message.length() <= 119)   // <-- this is the check I'm trying to write: run the script only for short messages
{
    char buf[160];   // note: the original buf[110] is too small for the prefix plus a 119-char message
    sprintf(buf, "[Web Chat] %s %s", username.c_str(), message.c_str());
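A self-contained sketch of that length gate (relayMessage is an illustrative wrapper, and the final printf stands in for "run the script"):

Code:

#include <cstdio>
#include <string>

// Runs the chat relay only when the message is at most 119 characters;
// otherwise does nothing, as required.
void relayMessage(const std::string &username, const std::string &message)
{
    if (message.length() > 119)
        return;                                  // too long: do nothing

    char buf[192];                               // room for prefix + name + message
    std::snprintf(buf, sizeof(buf), "[Web Chat] %s %s",
                  username.c_str(), message.c_str());
    std::printf("%s\n", buf);                    // stand-in for running the script
}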
I am using Visual C++ to write an app. I wrote a CMyObject class and may allocate a lot of instances of it, so I want to reduce the size of the CMyObject class.
Here is what I can figure out to do:
1. I can use the following code to get the accurate size used by a single instance of a CMyObject object:
CMyObject Object;
// Get the size of memory allocated for a CMyObject object
int nSize = sizeof(Object);
Is that correct?
2. To reduce the size of CMyObject, I have several ideas:
(1) Change member functions to static member functions where possible, since each member function will take some space in the instance of CMyObject.
(2) Change virtual member functions to non-virtual member functions where possible, since virtual member functions may take more space.
(3) Eliminate unnecessary member variables to reduce space.
(4) Finally, if (1), (2) and (3) do not work well, change CMyObject from a class to a struct that only contains member variables, which will eliminate the space allocated for the constructor and destructor of a class. (A quick sizeof experiment below checks these assumptions.)
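In standard C++, non-virtual member functions, constructors, and destructors add no per-instance size, while the first virtual function adds one vtable pointer per object; a minimal check:

Code:

#include <cstdio>

struct Plain     { int a, b; };                            // data only
struct WithFuncs { int a, b; void f() {} void g() {} };    // non-virtual functions
struct WithVirt  { int a, b; virtual void f() {} };        // one virtual function

int main()
{
    std::printf("Plain:     %u\n", (unsigned)sizeof(Plain));
    std::printf("WithFuncs: %u\n", (unsigned)sizeof(WithFuncs)); // same as Plain
    std::printf("WithVirt:  %u\n", (unsigned)sizeof(WithVirt));  // + one pointer (plus padding)
    return 0;
}

So sizeof(Object) in point 1 is indeed the per-instance footprint (excluding anything the object allocates on the heap), and ideas (1), (2) and (4) mostly matter only where virtual functions are involved; (3) is the reliable one.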
I'm able to get the graph, but I'm now stuck on getting the border on the other side of the graph to align with the right side of my screen. The output looks the way it does right now because I'm currently playing around with my y axis (right under the for loop), trying to scale everything. I've tried many different ways to get this to work.
Code:
#include <stdio.h>
#include <Windows.h>

int main(void)
{
    int MAX = 0;               // initialize and declare variables
    int allcounts[10] = {0};   // store an array of integer counts
}
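One common trick is to pad every row to a fixed width before printing the right border, so the border column does not depend on the bar length. A minimal sketch with made-up data (WIDTH and the counts are assumptions):

Code:

#include <stdio.h>

#define WIDTH 60    /* column where the right border sits */

int main(void)
{
    int allcounts[10] = {3, 7, 12, 20, 15, 9, 5, 2, 1, 1};  /* sample counts */
    int i, j;

    for (i = 0; i < 10; i++) {
        printf("%2d |", i);                       /* left axis label */
        for (j = 0; j < WIDTH; j++)               /* pad every row to WIDTH... */
            putchar(j < allcounts[i] ? '*' : ' ');
        printf("|\n");                            /* ...so the border lines up */
    }
    return 0;
}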
In an MDI app with some toolbars and a status bar, I also created a control bar which can be docked to the left side. My problem now: how do I get the exact height of the control bar?
In CalcDynamicLayout I set the height for the docked state to the height of the mainframe. This value is too big, but it basically works.
But how can I get the exact height of the control bar?
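One possible approach: instead of guessing from the mainframe height, ask MFC how much client space is left once the other bars are laid out. CWnd::RepositionBars with the reposQuery flag computes that rectangle without actually moving anything (QueryDockedHeight is an illustrative helper):

Code:

#include <afxwin.h>
#include <afxext.h>

// Returns a docked size whose height is the frame's leftover client
// height after toolbars and the status bar are accounted for.
CSize QueryDockedHeight(CControlBar *pBar, int desiredWidth)
{
    CRect rect;
    pBar->GetParentFrame()->RepositionBars(0, 0xFFFF, AFX_IDW_PANE_FIRST,
                                           CWnd::reposQuery, &rect);
    return CSize(desiredWidth, rect.Height());   // use from CalcDynamicLayout
}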
I need to use GetTextExtent, and I don't understand why GetTextExtent always returns the same value when I change some values of the selected font. This is my example:
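Whatever the missing example looked like, the most common cause is that GetTextExtent measures with the font currently selected into the DC; changing a LOGFONT or recreating a CFont has no effect until the new font is actually selected. A minimal sketch (MeasureText is an illustrative helper):

Code:

#include <afxwin.h>

// Measures a string with an explicitly selected font.
CSize MeasureText(CDC &dc, const CString &text, int pointSize, LPCTSTR face)
{
    CFont font;
    font.CreatePointFont(pointSize * 10, face, &dc);  // size is in tenths of a point

    CFont *pOld = dc.SelectObject(&font);  // select BEFORE measuring
    CSize size = dc.GetTextExtent(text);
    dc.SelectObject(pOld);                 // restore the previous font
    return size;
}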
The ImageList_Create() function takes two size parameters, cx and cy, which are the width and height of each image.
Everything is good if I know in advance what size my images will have. But what if I don't?
Let's say I select 32x32 and my images are sized 16x16. What will happen with the image list? Or my images are 48x48, and the image list should grow to accommodate the extra space. Since on Windows an image list is just one big bitmap, will the height of the image list shrink/grow to 16/48 or not? Is there a way to test this behavior?
The problem I'm having is checking whether the actual images will be truncated when they are bigger than the image list's initial size, or whether I will see some extra space when the images are smaller.
The closest way I see is this function, but I am not sure it's the right one to apply to the image list itself rather than to an image inside the list.
How to test this properly and reliably on Windows XP+?
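One way to probe this empirically: create the list at one size, add a bitmap of another size, and print what the list reports. The icon size of an image list is fixed at creation (ImageList_SetIconSize can change it, but it also removes all images), so the interesting part is what ImageList_Add does with a mismatched bitmap:

Code:

#include <windows.h>
#include <commctrl.h>
#include <stdio.h>
#pragma comment(lib, "comctl32.lib")

int main(void)
{
    HIMAGELIST himl = ImageList_Create(32, 32, ILC_COLOR32, 0, 4);

    /* Add a 16x16 bitmap to the 32x32 list and see what the list reports. */
    HBITMAP hbm = CreateBitmap(16, 16, 1, 32, NULL);
    int idx = ImageList_Add(himl, hbm, NULL);

    int cx, cy;
    ImageList_GetIconSize(himl, &cx, &cy);
    printf("ImageList_Add returned: %d\n", idx);   /* -1 means it failed */
    printf("icon size is still:     %dx%d\n", cx, cy);
    printf("image count:            %d\n", ImageList_GetImageCount(himl));

    DeleteObject(hbm);
    ImageList_Destroy(himl);
    return 0;
}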
I have created an exam environment for our schools. It comprises three files: a .kix file that, if the user is in the examination group, runs a .vbs file; the .vbs file kills the explorer.exe task so they have no taskbar or desktop shortcuts, and then opens a .hta file.
The .hta file is a user interface with icons for apps like WinWord that execute the application when clicked. There is also a log-off button.
I'm looking for some code that stops the students from being able to close, minimise or resize the window, so the environment is locked down and they literally can't do anything except click the icons inside the window.
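Since the window itself is the .hta, much of this can be done declaratively in the HTA:APPLICATION header of that file: border="dialog" makes the window non-resizable, and sysmenu="no" removes the close/minimise/maximise buttons. A sketch (the attribute values are the ones to experiment with):

Code:

<html>
<head>
<title>Exam Launcher</title>
<!-- border="dialog" = fixed size; sysmenu="no" = no close/min/max buttons;
     windowstate="maximize" fills the screen so nothing behind is clickable -->
<hta:application
    id="examUI"
    border="dialog"
    caption="yes"
    sysmenu="no"
    maximizebutton="no"
    minimizebutton="no"
    contextmenu="no"
    windowstate="maximize" />
</head>
<body>
    <!-- application icons and the log-off button go here -->
</body>
</html>

Alt+F4 can still close the window, so for a full lockdown you may also want to trap window.onbeforeunload or swallow the keystroke in an onkeydown handler.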
I have to rearrange the controls in a control bar based on the new size resulting from docking/undocking of the bars or resizing of the mainframe.
Is there a message I can use? How do I determine the new height when docked?
I tried to use its OnSize function. The problem with it is getting the new height: when the function is called, the height I get from GetWindowRect is still the old height.
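The cx and cy parameters that OnSize receives already carry the new client size, even while GetWindowRect still reports the old rectangle, so the usual fix is to lay the controls out from the parameters. A sketch, assuming a CControlBar-derived class (CMyControlBar is a stand-in name) with ON_WM_SIZE() in its message map:

Code:

void CMyControlBar::OnSize(UINT nType, int cx, int cy)
{
    CControlBar::OnSize(nType, cx, cy);

    if (cx > 0 && cy > 0)   // skip the degenerate sizes sent during docking
    {
        // Rearrange the child controls from cx/cy, e.g.:
        // m_list.MoveWindow(0, 0, cx, cy / 2);   // m_list is hypothetical
    }
}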
I'm trying to code one of the problems, but catching the errors is really hard for a novice like me at first.
I have to write a single C function for computing the histogram of a list of nonnegative integers into 4 bins.
The main() function first initializes a positive integer array called List of size N, takes 3 inputs from the user, A, B and C (assume 0 < A < B < C), and declares a second integer array Bin[4].
The doBinning function should count the number of elements of List in the interval [0, A) and store it in Bin[0], the count for [A, B) in Bin[1], the count for [B, C) in Bin[2], and the number of elements >= C in Bin[3].
Code:

#include <stdio.h>
#define N 10
#define M 4

int *doBinning(int source[], int dest[], int a, int b, int c);

int main()
{
[Code] ....
Somehow I keep getting an error from ptr = doBinning(List[], Bins[]). What am I doing wrong? The code might contain some errors.
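The error is most likely the call itself: arrays are passed by name, without brackets, and doBinning needs all five arguments, e.g. ptr = doBinning(List, Bins, A, B, C);. For reference, a sketch of a binning function matching the declared signature:

Code:

#include <stdio.h>
#define N 10

/* Counts the elements of source[] into dest[0..3] using thresholds a < b < c. */
int *doBinning(int source[], int dest[], int a, int b, int c)
{
    int i;
    for (i = 0; i < 4; i++)
        dest[i] = 0;
    for (i = 0; i < N; i++) {
        if (source[i] < a)      dest[0]++;   /* [0, a) */
        else if (source[i] < b) dest[1]++;   /* [a, b) */
        else if (source[i] < c) dest[2]++;   /* [b, c) */
        else                    dest[3]++;   /* >= c   */
    }
    return dest;
}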
I'm scaling the y axis on my histogram for one of my class projects. I've gotten mostly everything done but am stuck on scaling. I'm pretty sure it's something simple, but I'm having trouble, and I've tried everything I know to get it scaled from 0.1 to 1.0, like this:
Here is my code:
#include <stdio.h>     // include all preprocessor directives
#include <Windows.h>

int main(void)
{
    int MAX = 0;                // initialize and declare variables
    int allcounts[10] = {0};    // store an array of integers
    int yaxis, xaxis = 0;
[Code] ....
I'm close, but every time I try to change my y axis in the for loop from 1 going down to 0.1, it doesn't give the right output.
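Stepping a for loop by 0.1 with floats tends to drift; an integer row counter printed as row / 10.0 sidesteps that. A sketch with made-up counts (the sample data and the 10-row scale are assumptions):

Code:

#include <stdio.h>

int main(void)
{
    int allcounts[10] = {3, 5, 7, 10, 8, 6, 4, 2, 1, 1};   /* sample bin counts */
    int max = 10;                                          /* tallest bin */
    int row, bin;

    /* Walk rows from 1.0 at the top down to 0.1; the integer loop avoids
       floating-point stepping errors. */
    for (row = 10; row >= 1; row--) {
        printf("%4.1f |", row / 10.0);                     /* y-axis label */
        for (bin = 0; bin < 10; bin++)
            putchar((double)allcounts[bin] / max >= row / 10.0 ? '*' : ' ');
        putchar('\n');
    }
    printf("     +----------\n");
    return 0;
}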
I have a text file which consists of several values. These values represent the distance from the origin to points in a triangle. How can I create a histogram from these values? Is there any function in OpenCV which can create a histogram from these values?
Below my sample values are given:
... 3.4 1.2 6.9 0.2 5.1 2.9 ...
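cv::calcHist works on plain float matrices, not just images, so the values can be read into a vector, wrapped in a cv::Mat, and binned directly. A sketch (the file name, bin count, and value range are assumptions to adjust):

Code:

#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream in("distances.txt");        // path is an assumption
    std::vector<float> values;
    float v;
    while (in >> v)
        values.push_back(v);

    cv::Mat data(values, true);               // N x 1 CV_32F copy of the vector

    int histSize = 10;                        // number of bins
    float range[] = {0.0f, 10.0f};            // [min, max) of the values
    const float *ranges[] = {range};
    int channels[] = {0};

    cv::Mat hist;
    cv::calcHist(&data, 1, channels, cv::Mat(), hist, 1, &histSize, ranges);

    for (int i = 0; i < histSize; i++)
        std::cout << "bin " << i << ": " << hist.at<float>(i) << "\n";
    return 0;
}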
Write a function that generates 1000 normally distributed (Gaussian probability distribution) random numbers. The range should be between -3 and +3. The numbers should be double-precision floating point.
There's more to it than that, but I've got it from there.
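For the generation part, a sketch assuming mean 0 and standard deviation 1 (so -3..+3 is plus/minus three sigma) and redrawing the rare samples that fall outside the range:

Code:

#include <random>
#include <vector>

// Returns `count` normally distributed doubles, redrawing any sample
// outside [-3, 3] so the stated range holds (a truncated Gaussian).
std::vector<double> gaussianSamples(std::size_t count)
{
    std::mt19937 gen(std::random_device{}());
    std::normal_distribution<double> dist(0.0, 1.0);
    std::vector<double> out;
    out.reserve(count);
    while (out.size() < count) {
        double x = dist(gen);
        if (x >= -3.0 && x <= 3.0)      // rejects only ~0.3% of draws
            out.push_back(x);
    }
    return out;
}

Call gaussianSamples(1000) to get the required thousand values.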