Retool, a development platform for business software, recently published the results of its State of AI survey. Over 1,500 people took part, all from the tech industry. The headline finding:
Over half of all tech industry workers view AI as overrated.
Meh. Roughly 90% of what I know about baking is from ChatGPT. There just wasn’t a comparable resource. “Oh God, the dough is too dry”, “Can I sub in this fat for that fat, and if so, how?”, “If I change the bath, do I have to change the scoring method?”.
It is like I have a professional baker I can just talk to whenever. I am sure that as I get better at baking I will exceed its ability to help, but I can’t deny that what I have accomplished by now I could not have in the same timeframe without it.
So? Are you saying you disagree with the premise of the article because ChatGPT taught you how to bake? Professional tech work isn’t really comparable to baking at home.
I believe the central premise was: “Over half of all tech industry workers view AI as overrated.”
Not professional tech work. Really not sure what you want from me. I found it a useful tool and I am sorry it didn’t work out for you or your application.
You’re splitting hairs here. I think it’s fair to say that tech industry workers perform professional tech work. I mean, it’s cool that you learned to bake, but what makes you think this means you know what the skill requirements are for tech workers, or how well ChatGPT can cover gaps in those skills? Your dismissive ‘meh’ says to me, ‘Yeah, but I learned how to bake with ChatGPT, so I disagree with this statement.’
Ok fine you win. It is completely useless. Please go win arguments elsewhere.
I completely disagree with the premise that AI is overrated. It’s fucking underrated.
Sure, if you just blindly put input into ChatGPT and use the output, it’s not that useful. It’s a tool. You need to know how to use a tool to get the most use out of it. If you know how to use it properly, then it’s fucking amazing. Anyone claiming that AI can just magically work and do anything is a fraud.
The truth is that anyone who can’t get practical use out of it is probably ass at communicating with people. Pretend it’s another engineer who’s junior to you and you’re passing work off to them. If you can’t do that, then you probably shouldn’t ever progress beyond mid-level in your career.
ChatGPT has never worked well for me. Sure, it can tell you how to center a div, but for anything complex it just fails. ChatGPT is really only useful for elaborating on something. You can give it a well-commented code snippet, ask it to add some simple feature, and it will sometimes give a correct answer. For coding, it has about the same level of experience as a horde of high school CS students.
You could not possibly be more wrong.
I’m not going to dox myself by giving proof, but I have used it to create novel, complex frameworks from scratch. Sure, it starts with my idea and guidance. But nearly 100% of the code and documentation is AI generated. Nobody has been able to tell the difference when I show it to them. We’re talking engineers who became leads years ago.
If that’s your experience, then it speaks volumes about your ability to communicate. If you can lead a team and communicate requirements, then you can use AI.
Sorry that my personal experience with ChatGPT is ‘wrong.’ If you feel the need to insult everyone who disagrees with you, that seems like a better indication of your ability to communicate than mine. Furthermore, I think we’re talking about different levels of novelty. You haven’t told me the exact nature of the framework you developed, but the things I’ve tried to use ChatGPT for never turn out well. I do a lot of ML research, and ChatGPT simply doesn’t have the flexibility to help. I was implementing a hierarchical multiscale LSTM, and no matter what I tried, ChatGPT kept getting mixed up and implementing more popular models. ChatGPT, due to the way it learns, can only reliably interpolate between the excerpts of text it’s been trained on. So I don’t doubt ChatGPT was useful for designing your framework, since it is likely similar to other existing frameworks, but for my needs it simply does not work.
I have spent most of my professional career trying to get engineers to unionize and fight for better wages/benefits. The vast majority are apathetic and passive. Now we have a brand new technology that won’t make the profession obsolete, but will make the need for their labor obsolete. I’m personally in a safe spot. I have the portfolio and network to ensure I will have opportunities to work no matter what. But how many engineers simply do not have that? MBAs will raze this profession to the ground for short-term profits. They’ll destroy every pipeline that takes in entry-level engineers and gives them a chance to grow.
As far as I know, you’re just another engineer who doesn’t see what’s coming down the road. Pleasant speech hasn’t worked to get people to wake up over the past two years, and I’m running out of patience like you’re all running out of time.
And that’s the problem. You claim to be in ML research, and you don’t even know what your own tools and field are capable of. I haven’t just made frameworks that are “copies” of a framework but in another language, or a combination of existing frameworks. GPT-3.5 and GPT-4 are capable of novel outputs when given sufficient guidance. I’ve gotten it to fill in architecture gaps and implement complex, recursive functions. I don’t believe for a second that it’s “self aware”. But it’s far more capable than even you realize.
I’m willing to bet that if I were in your shoes I could get at least some usable output from it for your use case.
You talk a confident, condescending game for someone who can’t substantiate any of their claims lol.
EDIT: Dear lord, after looking through your comments I regret enabling you. Fuck Israel, free Palestine, AI is overhyped…
Okay Tankie.
Let’s play a little game, then. We both give each other descriptions of the projects we made, and we try to build the other’s project based on what we can get out of ChatGPT. We send each other the chat log after a week or something. I’ll start: the hierarchical multiscale LSTM is a stacked LSTM where each layer returns a boundary state which, if true, causes the layer above it to update. The final layer is another LSTM that takes the hidden state from every layer and returns a final hidden state as an embedding of the whole input sequence.
I can’t do this myself, because that would break OpenAI’s terms of service, but if you make a model that won’t develop into anything, that’s fine. Now, what does your framework do?
Here’s the paper I referenced while implementing it: https://arxiv.org/abs/1807.03595
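For anyone trying to follow the architecture being described: here is a toy sketch of just the boundary-gated update scheme, as I understand it from the comment above. This is not code from the linked paper — real HM-LSTM cells are replaced by simple tanh “cells,” the final combining LSTM is replaced by a plain mean, and every name (`ToyCell`, `run_hierarchy`, the 0.5 boundary threshold) is made up for illustration.

```python
import math
import random

class ToyCell:
    """Stand-in for an LSTM cell: one hidden scalar plus a boundary flag."""

    def __init__(self, seed):
        rng = random.Random(seed)  # fixed seed so the sketch is deterministic
        self.w_in = rng.uniform(-1, 1)
        self.w_h = rng.uniform(-1, 1)
        self.w_b = rng.uniform(-1, 1)  # weight producing the boundary signal

    def step(self, x, h):
        h_new = math.tanh(self.w_in * x + self.w_h * h)
        # Sigmoid of the new state, thresholded, decides whether the
        # layer above should update on this timestep.
        boundary = 1.0 / (1.0 + math.exp(-self.w_b * h_new)) > 0.5
        return h_new, boundary

def run_hierarchy(seq, n_layers=3):
    cells = [ToyCell(seed=i) for i in range(n_layers)]
    h = [0.0] * n_layers
    for x in seq:
        # Layer 0 always updates; each higher layer updates only when
        # the layer below raised its boundary flag this timestep.
        update = True
        for i in range(n_layers):
            if not update:
                break  # this layer and all layers above keep their state
            inp = x if i == 0 else h[i - 1]
            h[i], update = cells[i].step(inp, h[i])
    # The paper's final LSTM over all hidden states is replaced here by
    # a simple mean, just to show that every layer feeds the embedding.
    return sum(h) / len(h)

embedding = run_hierarchy([0.5, -1.0, 0.25, 0.75])
```

The point of the sketch is only the control flow: lower layers tick every timestep, higher layers tick on boundaries, so each layer ends up operating at a coarser timescale than the one below it.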
Cool. I’ll get back to you with a DM after the holidays. I’m not going to devote a week to proving a random internet stranger wrong, but I’ll at least give you a few hours.
What’s professional tech work when it’s at home?
I don’t know what that term is actually supposed to mean. Do they mean programming? System architecture? Systems management? Cyber security? What?
The term is so broad as to be meaningless, so I don’t think you can necessarily say that it’s any harder than baking, because we don’t know what on earth we’re talking about.
Professional tech work at home is still professional tech work. I think anyone who actually has a career in technology wouldn’t see a distinction here. Programming is not the same as systems architecture, systems management, etc. Programming is simply one of the tools you use as a software engineer. I do not think the term is too broad to be meaningless, and I think comparing learning to bake to software engineering is reductive and shows a lack of understanding of the requirements of the field.
Where does the article mention programming?
Buy a fucking book about baking. Not a fancy colour print recipe book, but a textbook, you know the kind with a chapter or two about dough chemistry.
If nothing else, as a beginner you have no idea which questions to ask and ChatGPT is never going to give you a dissertation about fundamental knowledge and principles. And you have absolutely no way to tell whether ChatGPT is spouting random nonsense.
I bought a fucking book, more than one, and it wasn’t as good. A fucking book can’t examine a picture and tell me what went wrong, the fucking book I bought didn’t have substitution charts, and the fucking book I bought didn’t respond to Ctrl+F.
Did you get this comment response or should I have faxed it to you?
If you need ChatGPT to analyse a picture for you you lack very basic knowledge about baking. It can’t smell, it can’t touch, it can’t hear, and it has never fucking ever baked. It has never taken a wet and sticky dough, tensioned it, and, voila, suddenly it’s a pleasure to handle.
And substitution charts? For what? If you understood the underlying dough chemistry you wouldn’t be asking in the first place. As I said: you lack the basic knowledge to know what questions to ask, and once you have that knowledge your questions will only be answered by experiment.
“If you studied the topic for multiple years of your life, AI is useless.” Great, thank you, I didn’t know that.
I’m a capable home baker, no more, no less. All I’m saying is that you’ll learn more, and faster, from sources which actually know what they’re talking about. Want to spend the next 10 years dabbling around, still not learning anything of substance? Go ahead, be my guest, stick with ChatGPT.
deleted by creator
I can make you a pumpkin cake in a pan on a stovetop if need be. It’s going to turn out better than in a rice cooker, because steaming a cake isn’t really a stellar idea (though bread is often baked with steam, for different reasons), and even if you get things working, rice cookers don’t produce temperatures which cause browning: they shut down at a baseplate temperature of about 105 °C, since that means all the water has evaporated. 140 to 165 °C is necessary, in case you’re wondering.
Did you really think about that question yourself, or did you go with the rice cooker because ChatGPT wasn’t smart enough to realise that, nope, that’s a bad idea, and chances are if you have a rice cooker you also have something to put a pan on? To me it seems like you’re outsourcing thinking to a smart-sounding idiot, and instead spending all your intelligence on justifying that decision post-hoc.
deleted by creator