
Basic programming issues...

RapidSnail

Diamond Member
I am in an intro CSE class for C++, and this week's lab assignment (which has already been submitted) was to develop a basic program that determines whether a number input by the user is positive or negative, and whether or not it is a multiple of five.

The theme was if and if/else statements which I understand and did. However, one part of my code bothered me by producing a strange output. It was not part of the assignment to test for this specific bug, but I want to know how to fix it.

In testing, when I input a value such as #@^&((@*! the cout statement would always be: "The integer value entered is negative and a multiple of five." (See code attached for context.) However, all of those symbols are of the char data type, and although chars have an equivalent integer value, I specifically defined the input to only accept int type data.

It also seems to disregard char symbols mixed with integers (e.g., 125!!%!125). The program returns inconsistent results with such inputs, even though I set a condition that should output an ERROR message when non-int data is entered.

Edit: The code looks sloppy, but I can't get it to stay organized like in VS.

Edit 2: Can't get it to look proper with ATF formatting. Here is a link to download the small .cpp file. If anyone remembers that website where users can copy/paste text to a server for public viewing, let me know.
 
C++'s stream input doesn't reject those symbols the way you'd hope. If you cin "@" into an int, the extraction simply fails and leaves the variable holding garbage such as -858993460 (an uninitialized value in Visual Studio debug builds). It's not smart enough to tell you that your input is not a number. At that point all of the logic you set up will run, but on garbage. Garbage in, garbage out, as the saying goes.

You should be validating your input. One way to do this is to input everything as a string and then convert the string into an integer value manually.
 
Originally posted by: RapidSnail
Originally posted by: Kyteland
You should be validating your input. One way to do this is to input everything as a string and then convert the string into an integer value manually.

How do I do that?

There are many ways to accomplish this. Perhaps the easiest method to fit into your current code is a stringstream. For example:

Another tactic for string conversion (or any type, really) is the use of templates. Here is an example. While this approach is overkill for your current assignment, it may become quite useful, especially if you have to convert many different strings to many different types.

 
I'm taking a beginning C# class and have just finished if/else and then loops. This would be easy to guard against in C#, since all input is read as a string: you would just check that the first character of the string is a digit, then convert the string to an integer.
I don't know if you can do that in C++. Read a string, test whether the first character is a digit (0-9), then convert the string to a number and do your calculations. A switch (I think it might be case in C++) statement might be good for this.
 
Originally posted by: between
Originally posted by: Kyteland
You should be validating your input.

If it is an introductory course, where they are just getting started with conditionals and control flow, then it's probably too early to be worrying about validating input...

What if he writes a very basic program that lets a user input a zero and then uses it as a denominator? Now is a very good time to start thinking about data validation.

 
The meta-point of all these posts is that there are lots of ways to check for errors -- to either prevent them or to handle them. The source of the 'problem' is that everything in a computer system is a number: a collection of bits. Strings, characters, symbols, code -- all numbers. It's up to the programmer to make sure that each number is handled correctly. When you use cin on a string, you implicitly assume the numbers stuffed into the string are null-terminated numeric characters. When they're not (e.g., #@!~^$), the assumption breaks, and so does the code. The various methods above are ways of validating the numeric-character assumption -- but you should be aware of the more general problem: everything is a number, and the interpretation of those numbers rests on a notion called 'type' that doesn't exist inside the computer -- it is an abstraction for the programmer.

As you progress in your programming career, you will do well to bear this point in mind, as huge classes of bugs are like this: Programs (and programmers) assume certain numbers will behave in certain ways, and when those behaviors are challenged, things break.
 