#### Re: Calculus and other shit math

Oh, okay. What are you studying? I did this stuff in high school and then took a much more hardcore calculus course, but I did electrical engineering, where students are routinely stuffed full of Calculus.

Studies show that learning exotic programming languages like Haskell, LISP/Scheme accelerates neckbeard & facial hair growth.

#### Re: Calculus and other shit math

Computer Science. I've taken Calc like twice, plus discrete math and some other bullshit so far, idk. I should probably be done with Calc I and II and on to, like, linear algebra or something by now, but I'm pretty fucking bad at this shit so w/e. Idgaf lmao.

What do you even mean by "more hardcore"? Just because I was going over some old shit in this thread doesn't mean that's what we were supposed to know to pass; I was just making comments about stuff I didn't get the first time around. My Calc class had plenty of engineers and aeronautical people and bio people from Penn State, and I'm pretty sure one or two Ivy League schools, in it, so it wasn't watered down. Some of them had a worse go of it than I did. It was an accelerated summer class, so it was a bit rushed, I guess, but we covered almost everything we did the first time I failed it.

Fucking nerds I swear.

Plus it's not like I'm ever gonna do any fucking research programming anyway, or probably even be a programmer kek. I have no idea what I want to do, but it's definitely not sitting around doing gruelingly boring and hard math bullshit fuck shit stack. What the fuck am I even doing with my life lol

#### Re: Calculus and other shit math

Graphics programming uses it, but even there it's largely limited to geometric transformations and the coordinate system; in R^3 you'll be using quaternions and vectors for rotations and positions, respectively. Actually most of it is linear algebra and trigonometry. You do use cross and dot products a lot, but those are simple.
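To show how simple that stuff actually is, here's a from-scratch sketch of dot/cross products and quaternion rotation (plain Python, no libraries; in real graphics code you'd use a math library instead of rolling your own):

```python
import math

def dot(a, b):
    # Sum of componentwise products; zero when the vectors are perpendicular.
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    # Vector perpendicular to both a and b (right-hand rule).
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (w, x, y, z) for a rotation of `angle` radians about a unit `axis`.
    s = math.sin(angle / 2)
    return (math.cos(angle / 2), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(p, q):
    # Hamilton product of two quaternions.
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw * qw - px * qx - py * qy - pz * qz,
            pw * qx + px * qw + py * qz - pz * qy,
            pw * qy - px * qz + py * qw + pz * qx,
            pw * qz + px * qy - py * qx + pz * qw)

def rotate(v, q):
    # Conjugation q * (0, v) * q^-1; for a unit q the inverse is just the conjugate.
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return (x, y, z)

# Rotate the x axis 90 degrees about z: it should land on the y axis.
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print([round(c, 6) for c in rotate((1, 0, 0), q)])  # [0.0, 1.0, 0.0]
```

That's basically the whole toolkit: positions as vectors, orientations as unit quaternions, and dot/cross for angles and normals.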

"Humanity Is Overrated" - Shrek

#### Re: Calculus and other shit math

"More hardcore" as in it included multivariable calculus, vector analysis and calculus of variations, none of which you'll need in order to program computers. They're useful to study things related to RF and high-power electrical machinery, so they weren't wasted on me (or at least so my teachers thought, little did they know my plan was to program computers all along). I certainly haven't used any of it after I finished university. I did use them in a programming context before I finished university, but I was writing simulation software. I did some RF-related work after that but nothing so important as to require this kind of stuff. My former colleagues who did go on to be real electrical engineers do use it sometimes.

V.R. is right, most of what you really want to know is in the courses that have to do with geometry, algebra and the various fields of discrete mathematics. Calculus deals with continuous quantities, which computers, by their very nature, can't handle in their primary form (barring various attempts at symbolic calculation which, IMHO, have yielded only moderately useful results outside the field of mathematics). It's, uh, nice to know, but not terribly useful.
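That point about computers and continuous quantities is easy to see concretely: when a program does "do" calculus, it's really discretizing, replacing a derivative with a finite difference and an integral with a finite sum. A minimal sketch:

```python
def derivative(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

def integral(f, a, b, n=100_000):
    # Midpoint Riemann sum approximation of the integral of f over [a, b].
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

print(round(derivative(lambda x: x**2, 3.0), 4))    # ~6.0
print(round(integral(lambda x: x**2, 0.0, 1.0), 4))  # ~0.3333
```

The answers are approximations with discretization and floating-point error baked in, which is exactly the "can't handle it in its primary form" problem.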

Sorry if it looked like I implied you were an idiot :D. I genuinely thought you were taking pre-university Calculus. Some of the things you mentioned are taught in high school in some (most?) parts of Europe. I guess I just did a reverse American tourist thing and assumed everything should be like Texas, except maybe sometimes without the sand.

Studies show that learning exotic programming languages like Haskell, LISP/Scheme accelerates neckbeard & facial hair growth.

#### Re: Calculus and other shit math

There was this one time that muh professor used calculus to solve a discrete math problem wat
Some algorithms use calculus as well... but I haven't used it ever hahrehreahrehf

sloth wrote:

Comfy does not provide challenge, challenge provides success, success provides happiness. Our world is not comfy, although we tried to make it so. Slaves of our own inventions, yada, yada. Not only on a technological level, also on a social and political level. Nothing more but apes. Apes with psychosomatic disorders.

#### Re: Calculus and other shit math

I mean I can see how calculus could be useful for some applications, I just doubt I'll ever be programming shit that will actually require me to do any of it. Pretty sure most shitty business web application nonsense whatever database fuckshit barely uses algebra, except in more specialized stuff. It's not like I'm gonna go work for a gaming company or at NASA or as a researcher, you know? Like I'm probably not going to graduate school.

I'm not good at math by any stretch of the imagination, but afaik the Calc at my Uni is pretty standard. I was supposed to have finished Calc a long time ago but I never did, so that's why I had to take it so late. I was gonna take Calc II and linear algebra or something but I don't know. We'll see about that. Calc is sort of kind of interesting sometimes, I guess, but it's also quite a bit of work ugh. I don't know, it's not too bad. I'm just lazy or something. Fucking optimization problems and all this.

I like to go back and work out fundamental stuff that I don't fully understand or have forgotten because it's hard for me to really get things unless I can kind of see the whole picture and how everything fits together or something, so that's probably why it looks like I was still doing the basics in this thread.

#### Re: Calculus and other shit math

loon_attic wrote:

There was this one time that muh professor used calculus to solve a discrete math problem wat
Some algorithms use calculus as well... but I haven't used it ever hahrehreahrehf

I heartily recommend this: http://longnow.org/essays/richard-feynm … n-machine/ (transcript below if, like me, you don't watch anything with the TED logo on it).

The really fun part:

By the end of that summer of 1983, Richard had completed his analysis of the behavior of the router, and much to our surprise and amusement, he presented his answer in the form of a set of partial differential equations. To a physicist this may seem natural, but to a computer designer, treating a set of boolean circuits as a continuous, differentiable system is a bit strange. Feynman's router equations were in terms of variables representing continuous quantities such as "the average number of 1 bits in a message address." I was much more accustomed to seeing analysis in terms of inductive proof and case analysis than taking the derivative of "the number of 1's" with respect to time. Our discrete analysis said we needed seven buffers per chip; Feynman's equations suggested that we only needed five. We decided to play it safe and ignore Feynman.

The decision to ignore Feynman's analysis was made in September, but by next spring we were up against a wall. The chips that we had designed were slightly too big to manufacture and the only way to solve the problem was to cut the number of buffers per chip back to five. Since Feynman's equations claimed we could do this safely, his unconventional methods of analysis started looking better and better to us. We decided to go ahead and make the chips with the smaller number of buffers.

Fortunately, he was right. When we put together the chips the machine worked.
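The essay doesn't give Feynman's actual router equations, but the trick itself is easy to illustrate with a toy example (my own made-up process, not anything from the Connection Machine): take a discrete quantity like "the number of 1 bits in a word", where each 1 independently clears with probability p per step, and compare the discrete expectation against the continuous model dn/dt = -p*n:

```python
import math
import random

# Toy illustration (NOT Feynman's equations): each 1 bit clears with
# probability p per step; the average count is treated two ways.
p, n0, steps = 0.1, 32.0, 20
random.seed(0)

def simulate(trials=5000):
    # Monte Carlo estimate of the average number of surviving 1 bits.
    total = 0
    for _ in range(trials):
        ones = int(n0)
        for _ in range(steps):
            ones = sum(1 for _ in range(ones) if random.random() > p)
        total += ones
    return total / trials

discrete = n0 * (1 - p) ** steps        # exact expectation of the discrete process
continuous = n0 * math.exp(-p * steps)  # solution of dn/dt = -p*n
print(round(simulate(), 2), round(discrete, 2), round(continuous, 2))
```

The continuous answer lands close to the discrete one (about 4.3 vs 3.9 here), and the differential equation is a one-liner you can reason about analytically, which is presumably why treating boolean circuits as a differentiable system worked out for him.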

Studies show that learning exotic programming languages like Haskell, LISP/Scheme accelerates neckbeard & facial hair growth.