So I finally gave Honeyminer a try. (my personal semi-review)
This review was last updated 11-30-18.

When I first was interested in trying this program I couldn't find anything about it. It seems a lot of people were too scared to try it, since there is almost no information about it other than what's on the web page itself. To be honest, I was a bit scared to try it too. I've tried many other programs of this kind on a "test" machine I'm not afraid to lose, on a secondary network and router, in case it's a scam or going to give me a virus, and I suggest anyone installing mining software do the same as a rule of thumb. Please keep in mind the software is still relatively new and they are working to improve it. They seem to be hiring as well; if you're interested in helping them grow by working for them, look near the bottom for their contact e-mail.

____________________________________________________________________________________________________

This review is for the Windows version of Honeyminer. Because it's still relatively new, I knew it could go one of two ways: "scam software," like almost every mobile mining app and even quite a few desktop ones, or legit. I'm glad to say after using it for a month it seems legit. I was able to withdraw from it no problem. If your system is really weak, it might not work that well on your computer or mining rig. There are no ads and the program doesn't seem to disrupt any day-to-day activity, at least not on my main system. However, you can of course expect increased heat production from your system, as with any mining software; adequate cooling is important in mining. Anyway, Honeyminer is as close to easy one-click mining software as I have come. They seem to be making a "pro" version too for more hardcore miners. They do take a fee, which is to be expected (look near the bottom for fee information), but that fee goes down significantly if you have multiple GPUs mining. The good thing about it for me was that it let me set my rig to "autopilot," so to speak.
If you wish to see the H/s numbers in real time, go to your settings and view the "expert logs," which will also tell you what coin is being mined at the time.

____________________________________________________________________________________________________________

Pros
Withdrawals (I know I shouldn't have to say this, but some mining software is a scam and won't withdraw anything. This was tested with Coinbase only so far, and it went through with no issue.)
(new) If you go to Dashboard > Activity on their site, you can see a list of all GPUs/CPUs and computers that are mining, with information about their temperature, the coin they are currently mining, number of cores, and the potential 24-hour revenue for each. This is just like the "see full activity" feature in the software itself, but you can check it from anywhere.
(new) You can set the app to only mine via GPU or CPU if you so choose in settings.
(new) A miner console has been added, which should make some of the more experienced miners a little happier.
When you click "see full history" it takes you to their webpage, where you can see all the transactions (where your Satoshis came from), labeled according to how they were acquired (Mining Credit, Mining Bonus, Referral Mining Credit, Referral Mining Credit Tier 2, and Bonus, meaning other kinds of bonuses like from leveling up). They are all time-stamped and have an ID number.
Easy to use/easy to install. I literally had no trouble setting it up or installing it; it was quick and easy.
GPU and CPU mining
Mines many different types of cryptocurrencies depending on what's more profitable at the time (autopilot)
Withdrawal as BTC (the withdrawal section says "coming soon: ETH, LTC," but I don't think it's a priority yet, and I'm not sure if they scrapped the idea of USD withdrawals altogether or not, but I don't see it there).
Idling option: for example, as soon as you use your mouse or type, it will stop mining.
Appears in the Task Manager, which is another one I should not have to say, but you'd be surprised how many fake mining programs will not show up there, or will be listed with an inconspicuous logo or disguised as a system process.
Works in system tray if you'd like to multitask and your system is up for it.
Can be set to mine as soon as you boot up.
Frequent mining "bonuses"; you will probably see a lot of them in your transaction history.
A "level-up" system, which I've not seen before, that pays you extra Satoshis for reaching the next "level." Think EXP in video games: you get rewarded for leveling up, and the higher your level, the higher the bonus, generally. The "next bonus" will update the closer you get to leveling up.
You can use multiple computers/rigs on the same account and see them all from any system with the app installed.
2 factor authentication which IMO is a must for anything like this, set that up on their webpage asap.
Earnings log, which you can access from the website manually or by clicking "see full history" in the app.
can see your earnings as USD or as BTC.
Shows you a quick earnings comparison between today and the previous two days (if you don't see it, update the software).
"pro" version currently in the works which I look forward to trying.
1st and 2nd tier referral rewards.
Referral profits DON'T come out of the referred person's profits; they come out of Stax Digital's profits, so there is no guilt in referring people to this product. I've seen or heard of referral programs that actually punish the referred folks by taking a commission of what the person would have made, in addition to taking their normal fee. In this case it comes out of the fee that Honeyminer already takes from all users, and not anything extra, as far as I know.
Referees also get rewarded. For example, if you were to sign up from my links, you would get 1,000 free Satoshis just for installing the app. (If you prefer to sign up directly, that's fine too, but there is no signing bonus if you go that route, unless you use someone else's referral link, as far as I'm aware. Whatever works for you, really.)
team is open to suggestions/feedback, friendly, and respectful.
Code is audited (at least that's what they say).
You can add multiple wallets on their webpage and delete them at will. Another one I should not have to say, but even today some places will not give you that basic functionality.
Able to see what type of coin each CPU/GPU is mining at the time (check out the options and "see full activity").
Pro and/or con (depending on how you look at it)
Uninstalling gets rid of most of the components that enable it to be used, but it seems to save some of the logs and some other files (I was able to search for and remove them in File Explorer). Many programs of all kinds do that, so it's not that big of a con to me, but I can see how it may bother some.
You are not asked to create a password; they create one for you, but you can change it once you have logged in, if you wish, from their website. This can be looked at as a good thing by some or a bad thing by others, for various reasons. If this is no longer the case, please let me know.
When clicking in the app to see your full history of transactions, it will take you to their webpage and make you log in again sometimes. This is a good or bad thing depending on how you look at it, I suppose. I personally prefer having to log in again.
No graphs or +/- earnings-over-time comparisons. It does have some logs to see what you're mining in the expert logs section, but not as much information as I would like (the miner console that was added also has more detailed info). But I'm hopeful for the future; every mining software that was any good started somewhere.
The installer was still packed with the first version when I downloaded it onto another setup, so you need to update it right off the bat. It doesn't take very long, but I like it when software packs installers with the latest version. (I don't know if this has changed, but if you downloaded it and it's already the latest version, let me know.)
May have trouble initializing some GPUs. Although I can't possibly test every kind, I have put the ones that didn't work for me below, and I will update the list if anyone else tells me it doesn't work with a certain setup.
_________________________________________________________________________________________________

COMPATIBILITY (sorry, it keeps adding asterisks to the card models for no reason)

WORKED ON: every Nvidia card tested so far, with card models dating back from 2014 to now. Worked on some surprisingly low-end and/or old CPUs and GPUs, like the AMD Radeon R9 380 card, in addition to an AMD Athlon II X3 450 processor, and it mines just fine. Of course that processor doesn't make much on its own, lol, but that's an extra 2 or 3 cents per day by itself. I've also tested it with an i3. Most AMD cards worked, but I ran into issues with a few, so maybe it's easier for me to just tell you what did not work.

DID NOT WORK ON:

--- Any of the AMD ATI Radeon HD 4250s tested so far (2). That particular card didn't work at all for mining; it never enabled the GPU. The CPU on that machine did work, however it would generate an "error" on startup but otherwise did not disrupt the mining on that system, except if I turned on idle earning mode, when I would get a bunch of errors as it was trying to access the GPU. We need the functionality to enable or disable hardware individually, I think (errors or no errors, it just seems like a good thing to have).

--- A system that had both AMD Radeon R7 Graphics and an AMD A8-7650K Radeon R7 (4C+6G), which surprised me considering some of the things that did work, lol. I think it might just be that one system, but either way I can't vouch that it will work. That system was pre-built and won't allow the parts to be changed or easily removed, and since I have to use it for other things, unfortunately I can't test those parts on another mainboard, at least not without wasting some time, money, and patience that I'd rather dedicate elsewhere for now.
I had some issues using one RX Vega 56 card, but I think it was just that card, because another one worked just fine.

________________________________________________________________________

FEES (w/ comparison to NiceHash)

I'm not sure if this part will be helpful to anyone looking into this software or anyone who's looking to try a different mining software, but if it does, great.

NiceHash charges the following fees for "selling/mining" or withdrawing:

Payouts for balances less than 0.1 BTC to an external wallet: 5%
Payouts for balances greater than or equal to 0.1 BTC to an external wallet: 3%
Payouts for balances greater than or equal to 0.001 BTC to a NiceHash wallet: 2%

Withdrawals from the NiceHash wallet are subject to a withdrawal fee, which depends on the withdrawn amount and the withdrawal option:

Any BTC wallet, from 0.002 BTC (min) to 0.05 BTC: 0.0001 BTC
Any BTC wallet, more than 0.05 BTC: 0.2% of withdrawn amount
Coinbase, more than 0.001 BTC: FREE, no fee (but they also say the minimum Coinbase withdrawal limit is adjusted dynamically according to the API overload)

_____________________________________________________________________________

Honeyminer fees are based on the number of GPUs working: 8% for 1 GPU, or 2.5% for 2 or more GPUs. The only withdrawal fee is the standard BTC transaction fee that the Bitcoin network charges, and it doesn't go to Honeyminer. When they add the other withdrawal options, that fee can be avoided, I suppose.

_________________________

Earnings (in comparison to NiceHash)

Update: sometimes software/test networks will give a view that can be off plus or minus a few percent compared to actual results. A lot of different things can affect your earnings, including where you are located in the world, whether you use more than one mining software day to day, ISP issues, crypto price fluctuation, updates to fees, and inaccuracies in test software/networks.
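To make the fee difference above concrete, here's a rough sketch of the math in Python. This is my own back-of-envelope helper, not an official calculator from either service; the rates are the ones quoted in this review and may have changed since.

```python
def honeyminer_fee(gross_btc, num_gpus):
    """Honeyminer's cut: 8% with 1 GPU, 2.5% with 2 or more (rates as quoted above)."""
    rate = 0.08 if num_gpus < 2 else 0.025
    return gross_btc * rate

def nicehash_payout_fee(balance_btc):
    """NiceHash payout to an external wallet: 5% under 0.1 BTC, 3% at or above."""
    rate = 0.05 if balance_btc < 0.1 else 0.03
    return balance_btc * rate

# Example: 0.05 BTC mined on a 2-GPU rig
mined = 0.05
print(honeyminer_fee(mined, 2))    # 2.5% of 0.05 BTC
print(honeyminer_fee(mined, 1))    # 8% of 0.05 BTC -- the single-GPU penalty
print(nicehash_payout_fee(mined))  # 5% external-wallet payout fee
```

As the numbers show, once you have a second GPU mining, Honeyminer's cut drops below NiceHash's small-balance external-wallet payout rate.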
I go back and forth between different ones from time to time, and I think that's good practice to keep options open. I notice that Honeyminer seems to do better for me at night-time; early morning/afternoon is when it has the most trouble raking in the cryptos.

That said, I've been trying to test how this compares to NiceHash earnings with two of my buddies. So this is an average of the three of our profits vs. losses compared to NiceHash. I'm using two 10 GPU / 3 CPU setups, one of my buddies is using two 1 GPU, 2 CPU setups, and the other is using two 30 GPU mini farms. We each have 2 networks, located relatively close by (less than half a mile at the furthest), one with Honeyminer running and the other with NiceHash, and we are looking over 24-hour periods. When all three of us have the results for one day, we average our results together. In all we will be looking over a 14-day period.

UPDATE: the results below were done well before the latest update to the software, so I do not know if they have changed. I'd have to do another round, or perhaps some of the community could give me their results and save me a bit of work; I'm not sure when I'd have the time to dig into it again. Sorry that it took me so long before I could get on here to post the results of the last few days of the tests.
Day One: -5%
Day Two: +10%
Day Three: +1%
Day Four: -6%
Day Five: -2%
Day Six: +11%
Day Seven: +2%
Day Eight: +1%
Day Nine: -5%
Day Ten: -11%
Day Eleven: +8%
Day Twelve: +1%
Day Thirteen: +1%
Day Fourteen: -1%
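For what it's worth, averaging those 14 figures shows just how close the two services ended up. A quick Python check, using the numbers exactly as listed above (positive = Honeyminer out-earned NiceHash that day):

```python
# The 14 daily results from the test above, in order.
daily_pct = [-5, 10, 1, -6, -2, 11, 2, 1, -5, -11, 8, 1, 1, -1]

average = sum(daily_pct) / len(daily_pct)
print(f"{average:+.2f}%")  # +0.36% -- basically a wash over the two weeks
```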
Earnings seem to be a bit lower than NiceHash at times and higher at other times. For me at least, it seems to pay quicker, and it gets deposited in my account sooner than I expected. Hopefully when they let us pick which coin to mine on our own, it may help somewhat, and any of you who want to move smaller volumes will probably benefit when they add the functionality to withdraw other coins/USD. Anyway, when their autopilot system works, it works great, but when it doesn't, it's just "okay," for lack of a better word.

_____________________________________________________

Contact: they have a "contact us" section on their webpage, and they also have a Reddit page, which I was made aware of by contacting them: https://www.reddit.com/HoneyMine

Careers: If anyone is interested in working for them, the job listings at the time of this writing were for Senior Java Developer(s) and Customer Service Representative(s); the email listed is [[email protected]](mailto:[email protected]). I'd suggest you check their site for the requirements. I just added this part to the review as a courtesy in case anyone's interested; it's not meant to be a focus of it. But I know we have some really talented people on Reddit who care about the crypto world passionately, so I'd rather give Honeyminer a chance to have some of those sorts on their team, since it might help improve the software faster for the end users, if that makes sense.

_________________________________________________________

UPDATE: If a question reminds me that I left out something I should have mentioned, I'll try to add it here so people don't have to scroll all over the place. I don't write many reviews (for anything), so I don't know if this one was any good, but I hope it was okay, and I'm still a relatively new Reddit user. I just wanted to write this review mainly because there was next to no information on Honeyminer when I looked for it, and maybe it can help anyone who's interested in it.

browolf2 asked: Is it basically like NiceHash then?
A: In a way. It's like NiceHash in that it's cloud-based, but you get paid not just when your pool completes an order; there are no "buyers," only "sellers," if you look at it that way. I hope I'm wording this the right way. It's just straight-up mining and they take their fee, but compared to NiceHash the fees for "mining" are different.

karl0525 asked: Do you know if we can contact the Honeyminer dev team and see if they will communicate here on Reddit? Might give them some good ideas about what us miners are looking for. Worth a try maybe? Thanks.

A: I submitted a question to the "contact us" part of their webpage and I got a reply from them; this is the message I received: "Thank you for writing in and for your interest in Honeyminer. We always welcome feedback and suggestions from our users. We are currently planning on expanding our online and social media presence. Please check out our Reddit page: https://www.reddit.com/HoneyMine"
Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.
I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about the PC’s “superiority” over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often related to power, costs, ease of use, and freedom. …Only problem: much of what they say is wrong. There are many misconceptions being thrown around about PC gaming vs. console gaming that I believe need to be addressed.

This isn’t about “PC gamers being wrong” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. Yes, this is coming from someone who mainly games on console, but I’m also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other.

I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples doesn’t mean that they aren’t out there.
“PCs can use TVs and monitors.”
This one isn’t so much a misconception as it is the implication of one, and overall just… confusing. It appears in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. Yes, as long as the ports of your PC match up with your screen’s inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up. I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. Even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller.”
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way. Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part One: the Software

This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about disks. Dirt Rally, a hardcore racing sim, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disk alternative that’s available for practically every console-available game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disk for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.

Part Two: the Subscription

Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side, it’s only required on PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here’s the thing: although you have to factor this $60 annual cost in with your console, you can make it balance out, at worst, and make it work for you as a budget gamer, at best. As nice as it would be to not have to deal with the price, it’s not a problem if you use it correctly.

Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make a full meal, you have to pay an annual fee. Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, it also allows you to eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let’s look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
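Putting rough numbers on that list (the figures are the ones quoted in this post; actual savings obviously depend on which free games you would have bought anyway):

```python
# Sketch of the PS Plus value math. $60 figures are as quoted in the post:
# the annual PS Plus fee and Just Cause 3's digital price.
PLUS_ANNUAL_FEE = 60

free_games_per_year = {
    "PS4": 12 * 2,            # 2 free PS4 games every month
    "PS3": 12 * 2,            # 2 free PS3 games every month
    "Vita-only": 12,          # 1 Vita-only game every month
    "Vita-compatible": 12,    # 1 PS4/PS3 + Vita compatible game every month
}
print(sum(free_games_per_year.values()))  # 72 free games per year

# A single $60 giveaway already covers the whole year's fee:
just_cause_3 = 60
print(just_cause_3 - PLUS_ANNUAL_FEE)     # 0 -- broken even; every game after is savings
```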
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe you get games one month that you don’t like; then just wait until next month.

Let’s look at Just Cause 3. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved; every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Though you could also just count games toward paying off PS Plus until you hit $60 in savings.) All in all, PS Plus, and Xbox Gold, which offers similar options, saves you money. On top of that, you don’t need these memberships to get discounts, but with them, you get more discounts.

Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? Not to mention, even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc sale discounts? A lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs balance out, at worst.

Part Three: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles per brand, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren’t short. The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total.

And let’s be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console at launch. Let’s look at PlayStation again, for example. In 2002, only two years after its release, the PS2’s retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, which is $100 - $200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention.

Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives? That adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
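To sanity-check that claim, here's the arithmetic spelled out, following one brand's lineup at the launch prices listed above:

```python
# Day-one console prices across the three generations, as listed in the post.
playstation = [299, 499, 399]   # PS2, PS3, PS4
xbox = [299, 299, 499]          # Xbox, Xbox 360, Xbox One

print(sum(playstation))         # 1197 -> roughly $1,200 over three generations
print(sum(xbox))                # 1097 -> roughly $1,100

# Versus Netflix's original $8/month plan over the same 17-year stretch:
print(8 * 12 * 17)              # 1632
```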
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if hardware you’ve been pushing with gaming would last for even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded in the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and not for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if you’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and throw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600, when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you’d only be able to play on PC, but there are also some games you’d only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
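Here's the cost-of-entry comparison from that paragraph spelled out. These are the ballpark figures quoted in the post (2017-era, discounted headset prices), not current retail:

```python
# Minimum cost of entry for each VR route, using the post's ballpark figures.
pc_vr_min = 800 + 500    # VR-capable PC + discounted Rift/Vive headset
psvr_min = 250 + 450     # PS4 Slim + full PSVR bundle (headset, camera, Move)

print(pc_vr_min)         # 1300
print(psvr_min)          # 700

vive_headset_alone = 600
print(psvr_min - vive_headset_alone)  # 100 -- "just over $100 more than a Vive alone"
```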
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so “low spec” that when a developer has to take them into account, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to open the game. It’s almost as if the devs didn’t worry about console specs when making the PC version of the game, because this version of the game isn’t on console. Or maybe the consoles aren’t holding the games back that much, because they’re not that weak. Just a hypothesis. But the devs are still ooobviously having to keep weak consoles in mind, right? They could make their games sooo much more powerful if they were PC-only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC only games. That’s right, no consoles to hold them back, they don’t have to worry about whether an Xbox One could handle it. Yet, they don’t require anything more than the Multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much of a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017) , the percentage of Steam gamers who use a GPU that's less powerful than that of a PS4Slim’s GPU is well over 50%? Things get dismal when compared to the PS4 Pro (Or Xbox One X). On top of that, the percentage of PC gamers who own a Nvidia 10 series card is about 20% (about 15% for the 1060, 1080 and 1070 owners). Now to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But, the number of Steam gamers with as much RAM or more than a PS4 or Xbox One is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day in age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the 10s of millions of 8th gen consoles sold; looking at it that way, sure the number of Nvidia 10 series owners is over 20 million, but that ignores the fact that there are over 5 times more 8th gen consoles sold than that. Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential for being more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right? 
Because of the previous point: PCs don’t have some ubiquitous quality over the consoles, developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU boss, the Ti managed 66 fps, or a 22% increase in frame rate for Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase performance, I bet some of you are willing to bet that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1GB decrease in VRAM. For [almost] doubling the price, you don’t get much. Well surely raw specs don’t tell the full story, right? Well, let’s look at some real wold comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well, then, raw specs does not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99. Well, let’s see what Tech Power Up has to say... 94.3 fps. 74% increase. Huh. Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the run down, how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but if you increase the cost by 129%, and you get a 79% (-50% cost/power increase) increase in frame rate. You increase it by 358%, and you increase the frame rate by 218% (-140% cost/power increase). That’s not paying “more for much more power,” that’s a steep drop-off after the third cheapest option. In fact, did you know that you have to get to the 1060 (6GB) before you could compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB) you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
128% increase in floating point speed, 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance based on price for the lower-cost GPUs, the same applies here. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games. …That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some, fueled the idea that console hardware is difficult too develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3, because they didn’t want to work with it’s hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers, which included making decisions that considered apps for the consoles’ usage for more than gaming. On top of that, using their single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is far better of a reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusive can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them. Their initial funding lasted for 6 months. From then, Sony offered additional funding, in exchange for Console Exclusivity. 
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double that of 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million. Now, back to the 65 million total user figure for Steam, the best I could find for reference for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, or increased by 6 fold. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM 2016, an awesome faced-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam. 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the consoles sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game was best sold on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way VG charts only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, when digital sales are included, are even higher than 3 million. This isn’t uncommon, by the way. Even with the games were the PC sales are higher than either of the consoles, there generally are more console sales total. But, to be fair, this isn’t anything new. The number of PC gamers hasn’t dominated the market, the percentages have always been about this much. 
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to be the one to try to put down someone/thing out of spite. This is about showing that PCs and consoles are overall pretty similar because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs, at the end of the day they’re both computers that are (generally) designed for gaming. This about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of your platform. I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between the PC and consoles, they’re far more similar than people think. There are upsides and downsides that one has that the other doesn’t on both sides. There’s so much more I could touch on, like how you could use SSDs or 3.5 inch hard drives with both, or that even though PC part prices go down over time, so do consoles, but I just wanted to touch on the main points people try to use to needlessly separate the two kinds of systems (looking at you PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, thisisn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
https://seekingalpha.com/article/4152240-amds-growing-cpu-advantage-intel?page=1 AMD's Growing CPU Advantage Over Intel Mar. 1.18 | About: Advanced Micro (AMD) Raymond Caron, Ph.D. Tech, solar, natural resources, energy (315 followers) Summary AMD's past and economic hazards. AMD's Current market conditions. AMD Zen CPU advantage over Intel. AMD is primarily a CPU fabrication company with much experience and a great history in that respect. They hold patents for 64-bit processing, as well as ARM based processing patents, and GPU architecture patents. AMD built a name for itself in the mid-to-late 90’s when they introduced the K-series CPU’s to good reviews followed by the Athlon series in ‘99. AMD was profitable, they bought the companies NexGen, Alchemy Semiconductor, and ATI. Past Economic Hazards If AMD has such a great history, then what happened? Before I go over the technical advantage that AMD has over Intel, it’s worth looking to see how AMD failed in the past, and to see if those hazards still present a risk to AMD. As for investment purposes we’re more interested in AMD’s turning a profit. AMD suffered from intermittent CPU fabrication problems, and was also the victim of sustained anti-competitive behaviour from Intel who interfered with AMD’s attempts to sell its CPU’s to the market through Sony, Hitachi, Toshiba, Fujitsu, NEC, Dell, Gateway, HP, Acer, and Lenovo. Intel was investigated and/or fined by multiple countries including Japan, Korea, USA, and EU. These hazard needs to be examined to see if history will repeat itself. There have been some rather large changes in the market since then. 1) The EU has shown they are not averse to leveling large fines, and Intel is still fighting the guilty verdict from the last EU fine levied against them; they’ve already lost one appeal. It’s conceivable to expect that the EU, and other countries, would prosecute Intel again. 
This is compounded by the recent security problems with Intel CPU’s and the fact that Intel sold these CPU’s under false advertising as secure when Intel knew they were not. Here are some of the largest fines dished out by the EU 2) The Internet has evolved from Web 1.0 to 2.0. Consumers are increasing their online presence each year. This reduces the clout that Intel can wield over the market as AMD can more easily sell to consumers through smaller Internet based companies. 3) Traditional distributors (HP, Dell, Lenovo, etc.) are struggling. All of these companies have had recent issues with declining revenue due to Internet competition, and ARM competition. These companies are struggling for sales and this reduces the clout that Intel has over them, as Intel is no longer able to ensure their future. It no longer pays to be in the club. These points are summarized in the graph below, from Statista, which shows “ODM Direct” sales and “other sales” increasing their market share from 2009 to Q3 2017. 4) AMD spun off Global Foundries as a separate company. AMD has a fabrication agreement with Global Foundries, but is also free to fabricate at another foundry such as TSMC, where AMD has recently announced they will be printing Vega at 7nm. 5) Global Foundries developed the capability to fabricate at 16nm, 14nm, and 12nm alongside Samsung, and IBM, and bought the process from IBM to fabricate at 7nm. These three companies have been cooperating to develop new fabrication nodes. 6) The computer market has grown much larger since the mid-90’s – 2006 when AMD last had a significant tangible advantage over Intel, as computer sales rose steadily until 2011 before starting a slow decline, see Statista graph below. The decline corresponds directly to the loss of competition in the marketplace between AMD and Intel, when AMD released the Bulldozer CPU in 2011. 
Tablets also became available starting in 2010 and contributed to the fall in computer sales which started falling in 2012. It’s important to note that computer shipments did not fall in 2017, they remained static, and AMD’s GPU market share rose in Q4 2017 at the expense of Nvidia and Intel. 7) In terms of fabrication, AMD has access to 7nm on Global Foundries as well as through TSMC. It’s unlikely that AMD will experience CPU fabrication problems in the future. This is something of a reversal of fortunes as Intel is now experiencing issues with its 10nm fabrication facilities which are behind schedule by more than 2 years, and maybe longer. It would be costly for Intel to use another foundry to print their CPU’s due to the overhead that their current foundries have on their bottom line. If Intel is unable to get the 10nm process working, they’re going to have difficulty competing with AMD. AMD: Current market conditions In 2011 AMD released its Bulldozer line of CPU’s to poor reviews and was relegated to selling on the discount market where sales margins are low. Since that time AMD’s profits have been largely determined by the performance of its GPU and Semi-Custom business. Analysts have become accustomed to looking at AMD’s revenue from a GPU perspective, which isn’t currently being seen in a positive light due to the relation between AMD GPU’s and cryptocurrency mining. The market views cryptocurrency as further risk to AMD. When Bitcoin was introduced it was also mined with GPU’s. When the currency switched to ASIC circuits (a basic inexpensive and simple circuit) for increased profitability (ASIC’s are cheaper because they’re simple), the GPU’s purchased for mining were resold on the market and ended up competing with and hurting new AMD GPU sales. There is also perceived risk to AMD from Nvidia which has favorable reviews for its Pascal GPU offerings. 
While AMD has been selling GPU’s they haven’t increased GPU supply due to cryptocurrency demand, while Nvidia has. This resulted in a very high cost for AMD GPU’s relative to Nvidia’s. There are strategic reasons for AMD’s current position: 1) While the AMD GPU’s are profitable and greatly desired for cryptocurrency mining, AMD’s market access is through 3rd party resellers whom enjoy the revenue from marked-up GPU sales. AMD most likely makes lower margins on GPU sales relative to the Zen CPU sales due to higher fabrication costs associated with the fabrication of larger size dies and the corresponding lower yield. For reference I’ve included the size of AMD’s and Nvidia’s GPU’s as well as AMD’s Ryzen CPU and Intel’s Coffee lake 8th generation CPU. This suggests that if AMD had to pick and choose between products, they’d focus on Zen due higher yield and revenue from sales and an increase in margin. 2) If AMD maintained historical levels of GPU production in the face of cryptocurrency demand, while increasing production for Zen products, they would maximize potential income for highest margin products (EPYC), while reducing future vulnerability to second-hand GPU sales being resold on the market. 3) AMD was burned in the past from second hand GPU’s and want to avoid repeating that experience. AMD stated several times that the cryptocurrency boom was not factored into forward looking statements, meaning they haven’t produced more GPU’s to expect more GPU sales. In contrast, Nvidia increased its production of GPU’s due to cryptocurrency demand, as AMD did in the past. Since their Pascal GPU has entered its 2nd year on the market and is capable of running video games for years to come (1080p and 4k gaming), Nvidia will be entering a position where they will be competing directly with older GPU’s used for mining, that are as capable as the cards Nvidia is currently selling. 
Second-hand GPU’s from mining are known to function very well, with only a need to replace the fan. This is because semiconductors work best in a steady state, as opposed to being turned on and off, so it will endure less wear when used 24/7. The market is also pessimistic regarding AMD’s P/E ratio. The market is accustomed to evaluating stocks using the P/E ratio. This statistical test is not actually accurate in evaluating new companies, or companies going into or coming out of bankruptcy. It is more accurate in evaluating companies that have a consistent business operating trend over time. “Similarly, a company with very low earnings now may command a very high P/E ratio even though it isn’t necessarily overvalued. The company may have just IPO’d and growth expectations are very high, or expectations remain high since the company dominates the technology in its space.” P/E Ratio: Problems With The P/E I regard the pessimism surrounding AMD stock due to GPU’s and past history as a positive trait, because the threat is minor. While AMD is experiencing competitive problems with its GPU’s in gaming AMD holds an advantage in Blockchain processing which stands to be a larger and more lucrative market. I also believe that AMD’s progress with Zen, particularly with EPYC and the recent Meltdown related security and performance issues with all Intel CPU offerings far outweigh any GPU turbulence. This turns the pessimism surrounding AMD regarding its GPU’s into a stock benefit. 1) A pessimistic group prevents the stock from becoming a bubble. -It provides a counter argument against hype relating to product launches that are not proven by earnings. Which is unfortunately a historical trend for AMD as they have had difficulty selling server CPU’s, and consumer CPU’s in the past due to market interference by Intel. 2) It creates predictable daily, weekly, monthly, quarterly fluctuations in the stock price that can be used, to generate income. 
3) Due to recent product launches and market conditions (Zen architecture advantage, 12nm node launching, Meltdown performance flaw affecting all Intel CPU’s, Intel’s problems with 10nm) and the fact that AMD is once again selling a competitive product, AMD is making more money each quarter. Therefore the base price of AMD’s stock will rise with earnings, as we’re seeing. This is also a form of investment security, where perceived losses are returned over time, due to a stock that is in a long-term upward trajectory due to new products reaching a responsive market. 4) AMD remains a cheap stock. While it’s volatile it’s stuck in a long-term upward trend due to market conditions and new product launches. An investor can buy more stock (with a limited budget) to maximize earnings. This is advantage also means that the stock is more easily manipulated, as seen during the Q3 2017 ER. 5) The pessimism is unfounded. The cryptocurrency craze hasn’t died, it increased – fell – and recovered. The second hand market did not see an influx of mining GPU’s as mining remains profitable. 6) Blockchain is an emerging market, that will eclipse the gaming market in size due to the wide breath of applications across various industries. Vega is a highly desired product for Blockchain applications as AMD has retained a processing and performance advantage over Nvidia. There are more and rapidly growing applications for Blockchain every day, all (or most) of which will require GPU’s. For instance Microsoft, The Golem supercomputer, IBM, HP, Oracle, Red Hat, and others. Long-term upwards trend AMD is at the beginning of a long-term upward trend supported by a comprehensive and competitive product portfolio that is still being delivered to the market, AMD referred to this as product ramping. AMD’s most effective products with Zen is EPYC, and the Raven Ridge APU. EPYC entered the market in mid-December and was completely sold out by mid-January, but has since been restocked. 
Intel remains uncompetitive in that industry as their CPU offerings are retarded by a 40% performance flaw due to Meltdown patches. Server CPU sales command the highest margins for both Intel and AMD. The AMD Raven Ridge APU was recently released to excellent reviews. The APU is significant due to high GPU prices driven buy cryptocurrency, and the fact that the APU is a CPU/GPU hybrid which has the performance to play games available today at 1080p. The APU also supports the Vulcan API, which can call upon multiple GPU’s to increase performance, so a system can be upgraded with an AMD or Nvidia GPU that supports Vulcan API at a later date for increased performance for those games or workloads that been programmed to support it. Or the APU can be replaced when the prices of GPU’s fall. AMD also stands to benefit as Intel confirmed that their new 10 nm fabrication node is behind in technical capability relative to the Samsung, TSMC, and Global Foundries 7 nm fabrication process. This brings into questions Intel’s competitiveness in 2019 and beyond. Take-Away • AMD was uncompetitive with respect to CPU’s from 2011 to 2017 • When AMD was competitive, from 1996 to 2011 they did record profit and bought 3 companies including ATI. • AMD CPU business suffered from: • Market manipulation from Intel. • Intel fined by EU, Japan, Korea, and settled with the USA • Foundry productivity and upgrade complications • AMD has changed • Global Foundries spun off as an independent business • Has developed 14nm &12nm, and is implementing 7nm fabrication • Intel late on 10nm, is less competitive than 7nm node • AMD to fabricate products using multiple foundries (TSMC, Global Foundries) • The market has changed • More AMD products are available on the Internet and both the adoption of the Internet and the size of the Internet retail market has exploded, thanks to the success of smartphones and tablets. • Consumer habits have changed, more people shop online each year. 
Traditional retailers have lost market share. • Computer market is larger (on-average), but has been declining. While Computer shipments declined in Q2 and Q3 2017, AMD sold more CPU’s. • AMD was uncompetitive with respect to CPU’s from 2011 to 2017. • Analysts look to GPU and Semi-Custom sales for revenue. • Cryptocurrency boom intensified, no crash occurred. • AMD did not increase GPU production to meet cryptocurrency demand. • Blockchain represents a new growth potential for AMD GPU’s. • Pessimism acts as security against a stock bubble & corresponding bust. • Creates cyclical volatility in the stock that can be used to generate profit. • P/E ratio is misleading when used to evaluate AMD. • AMD has long-term growth potential. • 2017 AMD releases competitive product portfolio. • Since Zen was released in March 2017 AMD has beat ER expectations. • AMD returns to profitability in 2017. • AMD taking measureable market share from Intel in OEM CPU Desktop and in CPU market. • High margin server product EPYC released in December 2017 before worst ever CPU security bug found in Intel CPU’s that are hit with detrimental 40% performance patch. • Ryzen APU (Raven Ridge) announced in February 2018, to meet gaming GPU shortage created by high GPU demand for cryptocurrency mining. • Blockchain is a long-term growth opportunity for AMD. • Intel is behind the competition for the next CPU fabrication node. AMD’s growing CPU advantage over Intel About AMD’s Zen Zen is a technical breakthrough in CPU architecture because it’s a modular design and because it is a small CPU while providing similar or better performance than the Intel competition. Since Zen was released in March 2017, we’ve seen AMD go from 18% CPU market share in the OEM consumer desktops to essentially 50% market share, this was also supported by comments from Lisa Su during the Q3 2017 ER call, by MindFactory.de, and by Amazon sales of CPU’s. We also saw AMD increase its market share of total desktop CPU’s. 
We also started seeing market share flux between AMD and Intel as new CPU’s are released. Zen is a technical breakthrough supported by a few general guidelines relating to electronics. This provides AMD with an across the board CPU market advantage over Intel for every CPU market addressed. 1) The larger the CPU the lower the yield. - Zen architecture that makes up Ryzen, Threadripper, and EPYC is smaller (44 mm2 compared to 151 mm2 for Coffee Lake). A larger CPU means fewer CPU’s made during fabrication per wafer. AMD will have roughly 3x the fabrication yield for each Zen printed compared to each Coffee Lake printed, therefore each CPU has a much lower cost of manufacturing. 2) The larger the CPU the harder it is to fabricate without errors. - The chance that a CPU will be perfectly fabricated falls exponentially with increasing surface area. Intel will have fewer high quality CPU’s printed compared to AMD. This means that AMD will make a higher margin on each CPU sold. AMD’s supply of perfect printed Ryzen’s (1800X) are so high that the company had to give them away at a reduced cost in order to meet supply demands for the cheaper Ryzen 5 1600X. If you bought a 1600X in August/September, you probably ended up with an 1800X. 3) Larger CPU’s are harder to fabricate without errors on smaller nodes. -The technical capability to fabricate CPU’s at smaller nodes becomes more difficult due to the higher precision that is required to fabricate at a smaller node, and due to the corresponding increase in errors. “A second reason for the slowdown is that it’s simply getting harder to design, inspect and test chips at advanced nodes. Physical effects such as heat, electrostatic discharge and electromagnetic interference are more pronounced at 7nm than at 28nm. It also takes more power to drive signals through skinny wires, and circuits are more sensitive to test and inspection, as well as to thermal migration across a chip. 
All of that needs to be accounted for and simulated using multi-physics simulation, emulation and prototyping." (Is 7nm The Last Major Node?) "Simply put, the first generation of 10nm requires small processors to ensure high yields. Intel seems to be putting the smaller die sizes (i.e. anything under 15W for a laptop) into the 10nm Cannon Lake bucket, while the larger 35W+ chips will be on 14++ Coffee Lake, a tried and tested sub-node for larger CPUs. While the desktop sits on 14++ for a bit longer, it gives time for Intel to further develop their 10nm fabrication abilities, leading to their 10+ process for larger chips by working their other large chip segments (FPGA, MIC) first." There are plenty of steps where errors can creep into a fabricated CPU. This is most likely the culprit behind Intel's inability to launch its 10nm fabrication process: they are simply unable to print such a large CPU on such a small node with high enough yields to make the process competitive. Intel thought it was ahead of the competition in printing large CPUs on a small node, until AMD avoided the issue completely by designing a smaller, modular CPU. Intel avoided any mention of its 10nm node during its Q4 2017 ER, which I interpret as bad news for Intel shareholders. If you have nothing good to say, you say nothing, and Intel having nothing to say about something fundamentally critical to its success as a company can't be good. Intel is on track, however, to deliver hybrid CPUs in which some small components are printed on 10nm. It has also recently come to light that Intel's 10nm node is less competitive than the GlobalFoundries, Samsung, and TSMC 7nm nodes, which means Intel is now firmly behind in CPU fabrication. 4) AMD Zen is a new architecture built from the ground up. Intel's CPUs are built on top of older architecture developed with 30-year-old strategies, some of which we've recently discovered are flawed. 
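The yield argument in points 1-3 above can be made concrete with the standard first-order Poisson defect model. The die areas are the ones quoted in the text; the 300 mm wafer, the ~10% edge loss, and the defect density are illustrative assumptions, not foundry data:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Rough die count: wafer area / die area, minus ~10% edge loss (assumed)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area / die_area_mm2 * 0.9)

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies with zero defects: exp(-area * defect_density)."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D = 0.001  # assumed defect density (defects per mm^2) -- illustrative only
for name, area in [("Zen die (44 mm^2)", 44), ("Coffee Lake (151 mm^2)", 151)]:
    n = dies_per_wafer(300, area)
    y = poisson_yield(area, D)
    print(f"{name}: {n} dies/wafer, {y:.1%} defect-free, {int(n * y)} good dies")
```

The smaller die wins twice: more dies fit on the wafer, and a larger fraction of them come out defect-free, which is the compounding advantage the post describes.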
Those flawed strategies resulted in the Meltdown flaw, the Spectre flaws, and also the ME and AMT bugs in Intel CPUs. While AMD is still affected by Spectre, AMD has only acknowledged being fully susceptible to Spectre variant 1; AMD considers variant 2 difficult to exploit on a Zen CPU: "It is much more difficult on all AMD CPUs, because BTB entries are not aliased - the attacker must know (and be able to execute arbitrary code at) the exact address of the targeted branch instruction." (Technical Analysis of Spectre & Meltdown, AMD)
Further reading:
• Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel's 'garbage' patches | ZDNet
• FYI: Processor bugs are everywhere - just ask Intel and AMD
• Meltdown and Spectre: Good news for AMD users, (more) bad news for Intel
• Cybersecurity agency: The only sure defense against huge chip flaw is a new chip
• Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign
Take-away
• AMD Zen enjoys a CPU fabrication yield advantage over Intel.
• AMD Zen enjoys a higher yield of high-quality CPUs.
• Intel's CPUs suffer performance drops of up to 40% from the Meltdown patches, which hurts server CPU sales.
AMD stock drivers
1) EPYC
• A critically acclaimed CPU sold at a discount compared to Intel.
• Not affected by the software slowdowns of up to 40% caused by the Meltdown patches.
2) Raven Ridge desktop APU
• Targets the unfed GPU market, which has been starved by cryptocurrency demand.
• Customers can upgrade to a new CPU or add a GPU later without changing the motherboard.
• AM4 motherboard supported until 2020.
3) Vega GPU sales to Intel for 8th-generation CPUs with integrated graphics.
• AMD gains access to the complete desktop and mobile market through Intel.
4) Mobile Ryzen APU sales
• Provides gaming capability in a compact power envelope.
5) Ryzen and Threadripper sales
• Fabricated on 12nm in April.
• May eliminate Intel's last remaining CPU advantage: single-core IPC. 
• AM4 motherboard supported until 2020.
• 7nm Ryzen on track for early 2019.
6) Others: Vega, Polaris, Semi-custom, etc.
• I consider any positive developments here to be gravy.
Conclusion
While in the past Intel interfered with AMD's ability to bring its products to market, the market has changed. The internet has grown significantly and now dominates computer sales. It's questionable whether Intel still has the influence to affect this new market, and doing so would almost certainly result in fines and further bad press. AMD's foundry problems were turned into an advantage over Intel. AMD's more recent past was heavily influenced by the failure of the Bulldozer line of CPUs, which dragged on AMD's bottom line from 2011 to 2017. AMD's Zen line of CPUs is a breakthrough that exploits an alternative, superior chip-design strategy resulting in a smaller CPU, and a smaller CPU enjoys compounded yield and quality advantages over Intel's architecture. Intel's lead in CPU performance will at the very least be challenged, and will more likely come to an end in 2018, until Intel releases a redesigned CPU. I previously targeted AMD to be worth $20 by the Q4 2017 ER. That was based on the speed at which Intel gets products to market; AMD is much slower by comparison. I believe the stock should be there already, but the GPU story dominated due to the cryptocurrency craze. Financial analysts need more time to catch on to what's happening with AMD; they need an ER driven by CPU sales, and I believe Q1 2018 is that ER. AMD had EPYC stock in stores when the Meltdown and Spectre flaws hit the news; these CPUs were sold out by mid-January and are high-margin sales. There are many variables at play in the market, but barring any disruptions I expect AMD to be worth $20 at some point in 2018 due to these drivers. 
If AMD sold enough EPYC CPUs due to Intel's ongoing CPU security problems, it may happen following the Q1 2018 ER. However, if anything is customary with AMD, it's that these things always take longer than expected.
Will I earn money by mining? - An answer to all newcomers
When people start their adventure with Bitcoin, they often go through a small gold fever over the concept of mining (I would know, that's how I started ;) ). Here is a small guide to answer the eternal question "will I make money with it?": First of all, let's talk about hardware (click on the link for a long and useful list). You won't make money mining bitcoins unless you have a really high-end GPU from ATI, an FPGA, or an ASIC. That's the short answer. A decent CPU can be used for Litecoin mining, which can be a small income in itself, but we are here to talk about Bitcoin. To see whether you will earn any money, you need to input a few pieces of data into a special calculator:
cost of your hardware (cost of buying an ASIC, GPUs, motherboards, power supplies, etc.)
how fast it can hash (megahashes per second) - you can get this from your hardware list
your cost of electricity (check with your power company)
And then there are two magical variables that will either make it all work out or doom you to failure: * difficulty - filled in automatically by the calculator, but for long-term mining (more than a few weeks) you want to be a pessimist. Multiply the value by 10 for predictions over a few months, or by 100 for a year or two (it will rise steeply soon) * bitcoin price - also filled in by the calculator - it might go up or down in the future, affecting your bottom line. It will probably increase in the long run, but let's be pessimistic and lower it to $10-$20 to make sure we earn money no matter what. With all your hard data and your guesses on those last two variables, you put everything into the mining calculator and see what you get. You will get your earnings in BTC and dollars, as well as a summary of your costs, when you will break even, and what your net income will be over your investment period. Most likely you won't earn money with Bitcoin mining, and that's okay - mining has become a very specialised process. If you want to invest money into new ASICs, you might be able to turn a tidy profit. TLDR: Use this to check everything. ASICs may earn you money; GPUs won't anymore.
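For the curious, the arithmetic such a calculator performs boils down to a few lines. This sketch uses the standard expected-reward formula for proof-of-work mining; the sample numbers (hashrate, difficulty, price, power draw, electricity rate) are illustrative only:

```python
def daily_profit_usd(hashrate_hs, difficulty, block_reward_btc,
                     btc_price_usd, power_watts, electricity_usd_per_kwh):
    """Expected daily mining profit.
    Expected BTC/day = hashrate * 86400 / (difficulty * 2**32) * block_reward."""
    btc_per_day = hashrate_hs * 86400 / (difficulty * 2**32) * block_reward_btc
    revenue = btc_per_day * btc_price_usd
    power_cost = power_watts / 1000 * 24 * electricity_usd_per_kwh
    return revenue - power_cost

# Illustrative numbers only: a 400 MH/s GPU rig at a pessimistic $20/BTC.
print(daily_profit_usd(hashrate_hs=400e6, difficulty=3_000_000,
                       block_reward_btc=25, btc_price_usd=20,
                       power_watts=300, electricity_usd_per_kwh=0.10))
```

Multiplying the difficulty by 10 or 100, as the post suggests, shrinks the revenue term proportionally, which is why pessimistic difficulty estimates flip most hobby setups into the red.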
What is Bitcoin? Bitcoin is an experimental system for transferring and verifying property, based on a peer-to-peer network without any central authority. The initial application and main innovation of the Bitcoin network is a decentralized digital currency system whose unit of account is the bitcoin. Bitcoin works with software and a protocol that allow participants to issue bitcoins and manage transactions collectively and automatically. As an open-source protocol, it also allows interoperability between the software and services that use it. As a currency, bitcoin is both a medium of payment and a store of value. Bitcoin is designed to self-regulate: its limited inflation is distributed homogeneously across the network's computing power, and the supply will be limited to 21 million units, each divisible to the eighth decimal place. The operation of the exchange is secured by a general organization that everyone can examine, because everything is public: the basic protocols, the cryptographic algorithms, the programs making them operational, the account data, and the developers' discussions. Possession of bitcoins is materialized by a sequence of numbers and letters making up a private key, which allows spending the bitcoins associated with it in the ledger. A person may hold several keys, gathered in a "Bitcoin wallet" - a web, software, or hardware keychain that provides access to the network in order to make transactions. The wallet holds public keys to check balances and to receive payments, and it also contains (often in encrypted form) the private key associated with each public key. These private keys must remain secret, because their owner can spend the bitcoins associated with them in the ledger. Any medium that can hold the sequence of symbols constituting your keychain will do: paper, USB stick, memory card, etc. 
With appropriate software, you can manage your assets on your computer or your phone. You can receive bitcoins to an account either from a holder of bitcoins who gives you some, for example in exchange for goods, or by going through an exchange platform that converts conventional currencies into bitcoins, or by earning them through participation in the collective verification of the currency. The Bitcoin source code was released under the open-source MIT license, which permits use, copying, modification, merging, publication, distribution, sublicensing, and/or sale of copies of the software, subject to including the copyright notice in all copies. Bitcoin's creator: Satoshi Nakamoto. What is Bitcoin mining? Technical details: During mining, your computer repeatedly computes a cryptographic hash (two successive SHA-256 passes) of what is called a block header. For each new hash, the mining software uses a different random number called a nonce. A block is found when the hash of its header, including the nonce, falls below the current target; how hard that is to achieve is called the mining difficulty. The difficulty is calculated by comparing how hard it is to generate a block now against how hard it was to generate the very first block. A difficulty of 70,000 therefore means 70,000 times more effort than it took Satoshi Nakamoto to generate the first block, back when mining was much slower and poorly optimized. The difficulty changes every 2016 blocks: the network adjusts it so that, at the current global computing power, generating 2016 blocks takes exactly 14 days. That is why the difficulty rises along with the power of the network. Hardware: In the beginning, mining with a processor (CPU) was the only way to mine bitcoins. Graphics cards (GPUs) eventually replaced CPUs because their architecture allowed a 50x to 100x increase in computing power while using less electricity per megahash than a CPU. 
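The nonce loop described above can be sketched in a few lines. This is a toy: the header bytes are fake and the target is artificially easy (a real block header is an 80-byte structure and real targets are astronomically smaller):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: two successive SHA-256 passes."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int, max_nonce: int = 10_000_000):
    """Try nonces until the double-SHA256 of header+nonce falls below target."""
    for nonce in range(max_nonce):
        h = double_sha256(header + nonce.to_bytes(4, "little"))
        # Interpret the 32-byte hash as an integer and compare against the target.
        if int.from_bytes(h, "little") < target:
            return nonce, h.hex()
    return None

# Toy example: a fake header and a very easy target (roughly 1-in-2048 odds per try).
result = mine(b"toy-block-header", target=2**245)
print(result)
```

Raising the difficulty is equivalent to lowering the target, so more nonces must be tried on average before one succeeds - which is exactly what the 2016-block retarget adjusts.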
Although any modern GPU can be used for mining, AMD's GPU architecture proved far superior to nVidia's for mining bitcoins, and the ATI Radeon HD 5870 was the most economical card for a time. For a more complete list of graphics cards and their performance, see the Bitcoin wiki: comparison of mining equipment. Just as mining moved from CPU to GPU, it then evolved to use Field Programmable Gate Arrays (FPGAs) as a mining platform. Although FPGAs did not offer the 50x to 100x speed increase of the CPU-to-GPU transition, they offered much better energy efficiency. A typical 600 MH/s graphics card consumes about 400W of power, while a typical FPGA device can offer a hash rate of 826 MH/s at 80W of power consumption - about 5x more computation for the same power. Since energy efficiency is a key factor in mining profitability, the GPU-to-FPGA migration was an important step for many people. The world of Bitcoin mining is now migrating to the Application-Specific Integrated Circuit (ASIC). An ASIC is a chip designed to accomplish a single task. Unlike an FPGA, an ASIC cannot be reprogrammed for other tasks: an ASIC designed to mine bitcoins cannot and will not do anything other than mine bitcoins. That rigidity allows ASICs to offer a 100x increase in computing power while reducing power consumption compared to all previous technologies. For example, a typical device offers 60 GH/s (1 GH/s equals 1000 MH/s) while consuming 60W of electricity. Compared to the GPU, that is a 100x increase in computing power and a 7x reduction in power consumption. Unlike the generations of technology that preceded it, the ASIC is the "end of the line" as far as major technology changes go. 
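Putting the quoted figures side by side makes the efficiency jumps obvious. The numbers below are the ones given in the text, expressed in MH/s per watt:

```python
# Hash-rate and power figures as quoted above (per typical device).
devices = {
    "GPU":  {"mhs": 600,    "watts": 400},
    "FPGA": {"mhs": 826,    "watts": 80},
    "ASIC": {"mhs": 60_000, "watts": 60},  # 60 GH/s = 60,000 MH/s
}

for name, d in devices.items():
    efficiency = d["mhs"] / d["watts"]  # MH/s per watt
    print(f"{name}: {efficiency:.1f} MH/s per watt")
```

Each generation improves efficiency by roughly an order of magnitude or more, which is why the text treats energy efficiency as the deciding factor between ASIC products.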
CPUs were replaced by GPUs, themselves replaced by FPGAs, which were replaced by ASICs. Nothing can replace the ASIC now or in the immediate future. There will be technological refinements in ASIC products, and improvements in energy efficiency, but nothing that can match the 50x to 100x jump in computing power, or the 7x reduction in power consumption, over the previous technology. This means the energy efficiency of an ASIC device is the only factor that matters across all ASIC products, since the expected lifetime of an ASIC device is longer than the entire history of Bitcoin mining so far. It is conceivable that an ASIC device purchased today will still be in operation in two years, as long as the unit remains economical enough to keep powered. Mining profitability is also determined by the value of bitcoin, but in all cases, the better a device's energy efficiency, the more profitable it is. Software: There are two ways to mine: by yourself (solo) or as part of a team (a pool). If you mine solo, you must install the Bitcoin software and configure it for JSON-RPC (see: running Bitcoin). The other option is to join one of the many available pools. In a pool, the reward from any block found by a member is split among all members. The advantage of joining a pool is more frequent and more stable earnings (this is called reducing the variance), but each payout is smaller. In the long run, you earn the same amount either way: mining solo gives you huge but very infrequent payouts, while mining with a pool gives you small, steady gains. Once you have your software configured, or have joined a pool, the next step is to configure the mining software. The most popular software for ASIC/FPGA/GPU mining is currently CGMiner, or a derivative designed specifically for FPGAs and ASICs, BFGMiner. 
If you want a quick overview of mining without installing any software, try Bitcoin Plus, a Bitcoin miner that runs in your browser on your CPU. It is not profitable for serious mining, but it is a good demonstration of the principle of pooled mining.
*TL;DR: Miner migration to litecoin, litecoin's similarity to the proven bitcoin, its relatively low market cap, Mt. Gox's APIs soon supporting litecoin, and litecoin's utility as a very liquid form of money are set to cause litecoin's price to skyrocket.* Read more at http://zamicol.blogspot.com/2013/04/why-i-think-litecoin-is-set-to.html Excerpts: Market Cap: With 333,201 blocks mined and 50 litecoins per block, there are currently 16,660,050 litecoins in circulation. The current market price of $2 USD gives litecoin a market cap of over $33 million. This may sound like a lot, but consider that even after the "crash" of the past couple of days, bitcoin's market cap is about $1,157 million USD at the current market price of $98 USD. If litecoin had bitcoin's market cap, each litecoin would be worth over $69. Granted, this may not be a fair comparison, since litecoin in its youth has not experienced the same proportion of inflation as its older brother. While bitcoin is already past the halfway mark toward its limit of 21 million bitcoins, litecoin is much younger and has not reached the halfway point to its limit of 84 million litecoins. If we factor in that there will be four times as many litecoins as bitcoins, each of the 84 million litecoins would still be worth over $13 if litecoin had bitcoin's market cap. Litecoin Mining Difficulty: A while back, my ears perked up at litecoin's prospects, because a new technology is set to give the market a good shaking. As anticipated, bitcoin's increasing difficulty is causing traditional miners to look for more profitable outlets for their existing infrastructure. Hundreds of ASIC miners are now active, which has forced the difficulty up dramatically to an all-time high of 7,673,000, with the next difficulty estimated to be near 9,000,000. News of the impending deployment of thousands more is forcing miners to rethink the allocation of their existing infrastructure. 
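The market-cap arithmetic in the excerpt can be checked directly from its own numbers:

```python
ltc_supply = 333_201 * 50             # blocks mined x 50 LTC per block
ltc_market_cap = ltc_supply * 2       # at $2 per LTC, just over $33M

btc_market_cap = 1_157_000_000        # ~$1,157M, as quoted
print(ltc_supply)                     # litecoins in circulation
print(btc_market_cap / ltc_supply)    # per-LTC price at bitcoin's market cap
print(btc_market_cap / 84_000_000)    # per-LTC price at the full 84M supply
```

The numbers reproduce the post's claims: 16,660,050 LTC in circulation, over $69 per coin at bitcoin's market cap, and over $13 per coin even at the full 84 million supply.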
Faced with bitcoin's increasing mining difficulty, GPU bitcoin miners have three options: keep mining bitcoin at a potential loss as electricity costs exceed the return on mining; turn off their miners and sell or retire their hardware; or look for more profitable applications for their existing infrastructure, such as litecoin. This is where litecoin's power is most apparent. Due to its use of the memory-intensive scrypt algorithm, dedicated litecoin hardware such as ASIC miners is not anticipated in the near future, giving GPU miners a window of opportunity for profit. Switching GPU hardware from bitcoin to litecoin is just a matter of installing a new mining application, and can be done with little configuration. The logical choice for most bitcoin miners will be to move their power to the litecoin network. In the short time I have been mining litecoin, I have seen the difficulty rise over 600%, meaning there is six times more computing power dedicated to litecoin than only a few months ago. This indicates that many miners have already realized that bitcoin offers them a bleak future and made the switch to litecoin. As these miners transition their hardware, litecoin's mindshare will increase, and it shouldn't be a quickly passing event. Miners can take confidence in litecoin, knowing that their infrastructure will be valuable for the foreseeable future, and this confidence is bound to pour over into the market price of litecoin. As an early bitcoin miner, I remember ATI 5870 graphics cards quickly becoming unavailable as individuals bought up the supply for bitcoin mining. That infrastructure still exists, and it will not go to waste.
[Build Help] Dedicated Rig with 4x 7970 or 4x 7950 for Bitcoin Mining. [x-post from /r/bitcoin]
Hey guys, I built my first rig using the advice you gave me - a GTX 680 gaming rig. It is, however, extremely inefficient for mining bitcoins and other cryptocurrencies. I wish to build a great ATI rig and have put together two different setups, which essentially come down to 4x 7950 vs 4x 7970 (it's almost impossible to get a 6990, let alone 4 of them). Here are the tentative builds (I used PCPartPicker): 4x7970: http://pcpartpicker.com/usedracodraconis/saved/1snT 4x7950: http://pcpartpicker.com/usedracodraconis/saved/1sqM Note that while the 7950 setup would have a lower hash rate, it also uses about 200W less power overall, which may be a net benefit in electricity savings in the long run. I am willing to spend a good amount of cash on this but want to be prudent about it as well. Would there be any way to optimize either build further, i.e. cost savings, performance boosts, etc.? Also, is there a need for a riser card with my specific mobo? Will there be any notable difference in performance with or without it? If so, is there a particular brand I should get? For reference values on bitcoin hash rates, see here: https://en.bitcoin.it/wiki/Mining_hardware_comparison It might seem like a pity to hook this up to a monitor and game on it, but I might do that at some stage :) Here is the thread on /r/Bitcoin. Also, do you guys know of any video tutorials showing how to set up a 4-way CrossFireX system? :D Thanks for this guys, and apologies for the long post! Cheers!
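One way to weigh the ~200W difference mentioned in the post is to price the extra draw over a year of continuous mining. The $0.10/kWh rate here is an assumption - substitute your own:

```python
def yearly_power_cost(extra_watts, usd_per_kwh=0.10):
    """Cost of running extra_watts continuously for one year."""
    return extra_watts / 1000 * 24 * 365 * usd_per_kwh

# The 7970 build draws ~200W more than the 7950 build (as stated above).
print(round(yearly_power_cost(200), 2))
```

Whether that yearly cost outweighs the 7970's higher hash rate depends on the difficulty and coin price at the time, so it's worth running both builds through a profitability calculator rather than comparing hash rate alone.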
Help me get more mining out of my ATI Radeon 6450?
https://en.bitcoin.it/wiki/Mining_hardware_comparison From the mining hardware comparison, reports are in the neighborhood of 30 Mhash/sec using phoenix/phatk on Windows 7 x64. I'm using Windows Server 2008 R2 x64, otherwise set up the same... and I'm seeing 4.4 Mhash/sec. This is a dedicated server, with the graphics card doing nothing else. GPU-Z says there's 0% load on the GPU when I'm not mining, 100% when I am. I wouldn't think it unreasonable to get slightly different numbers, but this is 85% off! Am I doing something wrong here? For comparison, a Radeon HD 4550 is doing 7 Mhash/sec on the same box. phoenix -u http://username.worker:[email protected]:8332/ DEVICE=0 -k phatk VECTORS BFI_INT FASTLOOP=false AGGRESSION=9 ATI APP SDK v2.4, phoenix 1.75, Windows Server 2008 R2 x64
from here, and it would be running at roundabout 950 Kh/s. However, neither my workers nor my wallet nor my dashboard would update after 8 hours of "mining" today: http://i.imgur.com/QoEpuni.png I didn't get any valid shares, nothing. I have no idea what I did wrong. Can somebody please help me? I am out of options.
[modpost] Possible wiki page, something I call "All about miners," covering things from basic terminology to miner config files and overclocking.
What is a miner? A miner is a computer set up to solve cryptographic hashes in the litecoin network. Once a clump of these hashes, or a block, is mined, litecoins pop out! It's like opening a box of chocolates, except you know what you're gonna get :) Miners also handle transaction confirmations, making sure no single coin is double-spent. Setting up your computer to be a miner. What kind of computer do I need? Optimally, you'd have a good power supply and a couple of decent Radeon/ATI/AMD graphics cards. Because of litecoin's hash algorithm, the gap between mining with graphics cards and with processors is smaller than in most other cryptocurrencies, meaning that mining with some desktop processors may be worth it after electricity costs. Note that mining on laptops is not recommended because of the heat generated, and mining with NVIDIA graphics cards may not be worth the cost. How do I know if litecoin mining will be profitable for me? First, check how fast you'll be mining with your hardware, how many litecoins you'll mine in a day, and how much litecoins are worth. Now, multiply the number of litecoins per day by their worth. Then find the power draw of your hardware and calculate your energy cost. Finish by subtracting the energy cost from your daily earnings. If the number is positive, you're making that much money per day; if negative, you're losing money. Keep in mind that the worth of litecoins goes up and down, and you have to earn back the cost of your hardware before you turn a profit. Mining difficulty also goes up and down, depending on how many people are mining, and how fast, relative to how quickly litecoins are supposed to be generated. See the economics post (coming soon) for more info. Okay, I did all that. How do I start? All you have to do is download a program and change some settings (later in the guide), and you're ready to go. If you're comfortable with configuration files and the command line, Reaper and cgminer are your best friends. 
Otherwise, GUIMiner-scrypt is right for you. If you want to mine on your processor, download the "batteries included" miner via this link; setup should be relatively self-explanatory. Do I mine alone? Due to the difficulty of mining, we recommend mining with a pool, where multiple people mine together. Visit your pool's about or help page for the proper miner settings, which we're about to get into in depth! Under the hood: configuring your miner (aka the hard part). Before we get started, you should become familiar with these terms:
host: Your pool's website
port: The internet port your computer uses to connect to your pool
worker: Anything that mines is a worker. Just a way for you and your pool to keep track of what's mining and how.
user: In mining programs, the user is the name of your worker, which by default tends to be poolusername.1 or poolusername_1, _2, etc.
pass: Password for your worker, NOT your pool password. This can usually be anything.
None of those will have any effect on how fast you mine. The settings that we'll be focusing on are:
worksize: Exactly what it sounds like
thread-concurrency: How many computations the GPU schedules simultaneously
vectors: Involves how memory is used
aggression/intensity: How aggressively your computer mines
threads_per_gpu: How many threads of data to process on a GPU, like threads of a CPU. Anything beyond 1 usually doesn't increase hashrate on modern cards.
device: First GPU is device 0, second is device 1, etc.
If you're using GUIMiner-scrypt, there are default settings for different cards (lower-right dropdown). I'm mining on a 7870; here is what it looks like for me. You can follow along with the rest of this guide to optimize your settings. GUIMiner-scrypt is just a GUI over cgminer and reaper anyway. If you are using a command-line miner like reaper or cgminer, I recommend you download and install Notepad++ (or SublimeText if on Linux). Reaper is currently considered the best tool for mining. After you unzip your downloaded file, in the folder you'll find reaper.conf. It should look something like this:
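(The original post showed a screenshot of the file here, which has been lost. Below is a hypothetical config of the usual shape, built from the settings defined earlier in this guide; the host, port, worker name, and tuning values are placeholders for your own pool and card.)

```
host pool.example.org
port 3333
user poolusername.1
pass x
protocol litecoin
worksize 256
vectors 2
aggression 18
threads_per_gpu 1
gpu_thread_concurrency 8000
device 0
```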
As you can see, my thread concurrency is slightly different from GUIMiner-scrypt's default; I found this concurrency gives me the best hashrate! NOTE: I do not use cgminer to mine litecoin. If you plan on using cgminer, which offers more hardware-control settings, you will want to create a text file in the cgminer folder. Open that text file with Notepad++ or SublimeText, then Save As > cgminer.conf > file type > all. This will save the file with the proper name and as the proper type. Note that cgminer does not support high concurrencies. For me, cgminer.conf would look something like:
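(The screenshot of the file is lost here as well. The following is a hypothetical cgminer.conf in cgminer's JSON format, illustrating the same pool settings plus the clock, voltage, and fan options discussed below; every value is a placeholder, not a recommendation for your card.)

```json
{
  "pools": [
    {
      "url": "stratum+tcp://pool.example.org:3333",
      "user": "poolusername.1",
      "pass": "x"
    }
  ],
  "intensity": "13",
  "worksize": "256",
  "thread-concurrency": "8000",
  "gpu-engine": "1050",
  "gpu-memclock": "1450",
  "gpu-vddc": "1.100",
  "gpu-fan": "0-85",
  "auto-fan": true
}
```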
Those settings are similar to what we saw in Reaper's litecoin.conf. The other settings have to do with my card's clocks, voltage, and fan; this is covered in the overclocking section right below! Overclocking (aka the risky part). Okay, first off: I'm not responsible if you cause damage to your parts. Please research safe overclock settings for your card. Second, don't be afraid. Modern hardware has many safety features in place that help prevent mayhem like me... lol jk, this isn't a car insurance ad. For your better understanding, become familiar with these terms:
Voltage/vddc: The core voltage supplied to your card
Power Limit: Caps how much power your card may draw before it throttles itself
Core Clock: Speed of your GPU's core, similar to CPU core clocks
Memory Clock: Speed of your GPU's VRAM, similar to RAM speed
Fan speed (%): Determines the RPM of your fan once your card reaches certain temperatures.
No one setting controls how effectively you mine; what matters most when it comes to clocks is the ratio between your core and memory clocks. Generally, a ratio of 0.7 or below is best, but you will need to experiment. If you're using cgminer, you can control card settings from the conf file. If you aren't, I recommend MSI Afterburner as your overclocking tool; you will need to unlock some settings. Using my cgminer settings, MSI Afterburner looks like this. I have found these settings to be the most stable while bringing me a high hashrate. Other people's optimum settings: You can check the sidebar for the hardware comparison chart, but it is rarely updated and has huge swings in results; still, it is a good starting place. The mods of this subreddit will be putting together an updated, more accurate list in the near future. END. I hope all goes smoothly for you and that you've learned a lot! Please consider donating LTC to: My wallet: LiD41gjLjT5JL2hfVz8X4SRm27T3wQqzjk The writer of the [Consolidated Litecoin Mining Guide], which helped get me started The writer of the [Absolute Beginner's Litecoin Mining Guide], which also helped me get started
Am I missing something? Hardware investment vs Currency investment
EDIT: I understand that BFL products are on back order, and upon ordering now I probably won't have them for 60-120 days. That is a risk I will accept, and it doesn't fundamentally change my question. It will certainly change the math behind it, though, and as I'm still quite fledgling in this, I'm interested to hear the estimates of those who have done more research - things like the estimated growth in difficulty due to hash-rate increase (current total power + the sum of all BFL preorders would be a good start), the next specific decrease in BTC per block, etc., which I will research further once I'm off work. EDIT2: Found the missing parts: underestimating the increase in difficulty by the time I'm likely to receive hardware from BFL, due to overestimating how quickly BFL hardware will actually arrive. I am now thinking the investment (~$3-5000) I wanted to make would be better spent redistributed a few different ways (all of these would be preceded by more research):
Maybe 1-2 Jalapenos on preorder, depending on how fast BFL's fulfillment continues to progress over the next couple of weeks
~$1000 in BTC, for starters, sometime within the next week; as difficulty increases and BTC continues to gain acceptance as a currency, the potential for a good return seems to exist.
An ATI graphics card I can use for gaming right now and for BTC mining during work/sleep. I game a lot, and I opted for one strong nVidia GPU when I built my system, so I've both got room for and will make use of another card. After a quick bit of research, it looks as though mixing nVidia and AMD cards is not a problem.
I'd heard about bitcoin before, but hadn't really done any research until a couple of days ago. I've been looking to invest some spare cash (I already have a Roth and 2 other IRAs, plus a savings net). So far, based on all the calculations I've done, BFL bitcoin hardware will pay for itself even at conservative estimates, like this: using the bitcoinx profit calculator, I simulated a 4-fold increase in difficulty (as ASIC systems become more widely distributed) by taking the block reward from 25 down to 6.25 BTC. Further, I took the value to $75 USD/BTC and the hash rate to 90% efficiency. The GH/s per dollar is pretty equal across the BFL product line, and with these numbers the hardware breaks even at ~99 days. At present I do not pay for my own electricity. However, I've found that adding electricity costs makes relatively little difference even at wildly exaggerated consumption values: 1KW at $0.10 per kWh (approximately equal to leaving your microwave on all day, or your clothes dryer on for 6-10 hours per day) only added 10 days to the break-even time. What am I missing? Why are so many people suggesting that others just buy bitcoins? Is that coming from a vocal minority trying to keep others out of the game to help their hardware investment stay profitable? No hate on that, by the way - it makes sense. I'm honestly just asking. I'm interested in BTC for the sake of watching an alternate currency and tracking an investment. However, if there isn't something missing here, all those suggesting buy-in for the past months - and it's been months of people saying the same thing, "it used to be good, but you'd better just buy in now" - aren't looking at the numbers. Related tangent, inquisitive: why does the bitcoinx mining hardware comparison page show 30.2/46.2/46.1 for the BFL products' MH/s/$ values? Those numbers don't add up.
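The poster's simulation can be reproduced directly: a 4x difficulty increase is modeled by cutting the block reward from 25 to 6.25 BTC, exactly as described above. The hardware price, hash rate, and difficulty below are placeholders - plug in the specific BFL product and current network stats:

```python
def break_even_days(hw_cost_usd, hashrate_ghs, difficulty, block_reward_btc,
                    btc_price_usd, power_kw=0.0, usd_per_kwh=0.10):
    """Days until cumulative mining profit covers the hardware cost.
    Expected BTC/day = hashrate * 86400 / (difficulty * 2**32) * reward."""
    btc_per_day = hashrate_ghs * 1e9 * 86400 / (difficulty * 2**32) * block_reward_btc
    daily = btc_per_day * btc_price_usd - power_kw * 24 * usd_per_kwh
    if daily <= 0:
        return float("inf")
    return hw_cost_usd / daily

# Placeholder device: $1,299 for 50 GH/s, run at the post's 90% efficiency.
days = break_even_days(hw_cost_usd=1299, hashrate_ghs=50 * 0.9,
                       difficulty=8_000_000,      # placeholder network difficulty
                       block_reward_btc=6.25,     # simulated 4x difficulty increase
                       btc_price_usd=75)
print(round(days))
```

The missing piece the poster eventually identifies falls out of this model naturally: by the time back-ordered hardware arrives, `difficulty` has grown far beyond the value used in the estimate, pushing the break-even date out or past the hardware's useful life.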
[Build Help] $1500 gamer, multi-screen capability, with some scryptcoin mining/protein folding/video editing on the side
I tried this in /buildapcforme and didn't get any response, but I don't think I need help from scratch since I already have an idea of what I want in terms of parts, so here's a shortened copypaste: Uses:
Modest gaming - no more than 1080p, at as high a quality setting as possible before framerates dip below 40. I play poorly optimized FPS games (BF4, Arma/DayZ) and other common titles (GTA5, CSGO, Skyrim, indies), and I love to crank stupidly high-quality graphics out of Flight Sim X, hence the desire for triple monitors
Future proof. This is the last computer I want for 5-10 years, but I'm not opposed to upgrading individual parts.
triple-monitor capability as a desire but not a requirement
Bitcoin/Scryptcoin (Litecoin/Dogecoin (very moon)) mining and protein folding in the meantime. I'm not looking to make a large profit by any means, so don't include this in your build thoughts if you'd rather not worry about it. I just want to put my PC to good use when I'm not using it.
HD video editing.
Matlab and CAD (Solidworks & Electromagnetic modeling)
Dual-boot with various linux partitions.
Budget: As cheap as possible. I'll go over no more than $2000 if need be for the triple-monitor capability.
Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?
Down the line. I'd like to see more about benefits of OCing.
If there's any specific features you want/need from the rig, please list them.
Good cooling is a must for high GPU intensive processes (mining, folding).
Watercooling is a plus if it provides better performance than air.
SSD = YES.
Do you have any specific case preferences such as a window or LEDs, or do you have a preference for low-noise components?
Don't care about having a snazzy case... I'd make one out of plywood, tbh.
USB and audio on front.
Noise isn't a big deal.
I've read the logical increments guide, and it looks like the enthusiast level is right for me. I'd like to get the price under $1,500 though. I want a PC that is at the knee of the price vs. performance curve.
Prices include shipping, taxes, and discounts when available.
Generated by PCPartPicker 2013-12-25 05:31 EST-0500
With that build, what are your thoughts on the 2x R9 280Xs? I'm having trouble finding comparable benchmarks, other than this, which only compares it to lower-scoring GPUs. I see them as providing a fair amount of number-crunching capability (aka mining), as well as still being top-of-the-line graphics for gaming. I probably shouldn't concern myself with mining if I can get better performance out of games with a single GPU for less money.

I've never SLI'd or CF'd, and everywhere I see it mentioned, I notice people have problems with it or that it doesn't always work. What should I know about multi-GPU setups? I notice a lot of budget builds use an AMD CPU. How much can I expect out of the 4770K over a similar AMD CPU like the FX-9590 or 8350 (a much cheaper one), aside from the marginal change in benchmark scores? I could probably save by getting a cheaper PSU. I live near a Microcenter; I'm guessing their prices are lower for pickups.

Sorry for the long post. I hope it doesn't scare anyone off! I'm just not going to drop $1,500 without properly educating myself first. Thanks!
Why do more expensive GPUs have worse MH/s rates than less expensive ones?
For example, on this page of mining hardware from the sidebar, the NVidia Tesla K20 gets 134.8 Mhash/s and costs over $3,000.00, while an ATI 7970 gets 825 Mhash/s and costs about $380.00. Presumably the NVidia has a higher FLOP rating than the ATI, so why doesn't the more powerful NVidia have a higher mining rate? Is it that the NVidia simply isn't made for the kind of computing involved in bitcoin mining? EDIT: The ATI card has the following specs:
- 3.79 TFLOPS single-precision compute power
- 947 GFLOPS double-precision compute power
- 2048 stream processors
The NVidia Card has the following specs:
- 2496 CUDA cores
- 1.17 TFLOPS peak double-precision floating point performance
- 3.52 TFLOPS peak single-precision floating point performance
So the FLOPS are comparable overall, and the NVidia card has more cores. I don't understand how the ATI gets over 6 times the Mhash/s that the NVidia does.
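The short answer is that SHA-256 mining is pure 32-bit integer work (adds, XORs, rotates), not floating point, so TFLOPS is the wrong yardstick. AMD's VLIW architecture of that era issued far more integer operations per clock and (as I understand it) had a single-instruction 32-bit rotate, while NVidia needed several instructions per rotate. A quick per-dollar comparison using the numbers quoted above:

```python
def mhash_per_dollar(mhash_s, price_usd):
    # Hashing is integer work; price efficiency, not FLOPS, is what matters.
    return mhash_s / price_usd

radeon_7970 = mhash_per_dollar(825, 380)   # ~2.17 MH/s per dollar
tesla_k20 = mhash_per_dollar(134.8, 3000)  # ~0.045 MH/s per dollar
print(round(radeon_7970 / tesla_k20))      # → 48: ~48x better value for mining
```

The K20 is a compute card priced for double-precision scientific workloads, which mining never touches, so its price/hash ratio is terrible even before the architectural gap.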
My computer currently gets like 20 MHash/s. I am debating whether or not to get an ATI 6990, because it supposedly gets 700, which by my rough math should yield about .0392 BTC a day, or currently roughly $4.90 USD. If I get a BitForce SC that does 4.5-5 GHash/s, I'd be getting .25 BTC a day, or roughly $34 USD per day. If I get four of them, that's about 1 BTC, or $136 USD, a day. I'm just a little wary: if someone could make machines that earn that much BTC a day, why would they sell them? Also, it seems as though this money is coming out of thin air, if I understand correctly. I guess my question is why more people don't do this. Not sure if this has been asked before, but I haven't seen it. Chart I'm using for my MHash/s numbers: https://en.bitcoin.it/wiki/Mining_hardware_comparison Calculator: http://dev.bitcoinx.com/profit/
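This kind of rough math boils down to one ratio: expected daily output is your share of the network's hash power times the day's total block rewards. The network hash rate below (~64.3 TH/s) is an assumed figure chosen to reproduce the 0.0392 BTC/day estimate above, not a quoted value:

```python
def btc_per_day(my_mhash_s, network_mhash_s, block_reward=25.0):
    """Expected BTC/day: your fraction of network hash power
    times ~144 blocks/day times the block reward."""
    blocks_per_day = 24 * 60 / 10  # one block every ~10 minutes on average
    return my_mhash_s / network_mhash_s * blocks_per_day * block_reward

network = 64.3e6  # assumed network hash rate, in MH/s (~64.3 TH/s)
print(round(btc_per_day(700, network), 4))   # ATI 6990 → ~0.0392 BTC/day
print(round(btc_per_day(4500, network), 2))  # BitForce SC → ~0.25 BTC/day
```

This also answers the "thin air" worry in part: every new miner shrinks everyone's fraction of the fixed 3,600 BTC/day issuance, which is exactly why ASIC pre-orders are not free money.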
I've been reading extensively about bitcoin for the past few days, and there are a few holes here and there that I'm trying to understand. So far, it makes me believe this whole mining thing is some sort of elaborate scam. Here are a few unorganized points that confuse me.

Why are other currencies like litecoin slowly becoming popular, and why do people want them to become popular? What purpose does litecoin serve that bitcoin doesn't? If a redundant, second-to-bitcoin currency like litecoin becomes popular, and people want it to become popular, then what stops more such currencies from all becoming popular, making each of them spammy/redundant in the end? Don't we only need one of these currencies to serve the purpose of worldwide decentralized transactions?

The ever-popular and much-referenced mining hardware comparison sheet gives a list of video cards that are recommended. Combined with this calculator, people can work out whether investing the electricity cost into mining is worth it. At first it seems to yield a little free income at the end of the month; all seems well until you investigate further. The power consumption shown there for your video card is, in the Radeon 6850's case, only half of what it truly uses at full load. To add to this, I've been mining for more than a day at full power without stopping or interfering with the process. The calculator tells me I should be making 0.0125 bitcoin a day, but I barely made half of that in a bit more than a day, yet I am positive the video card ran at full strength the whole time. Double the electricity cost and half the production, and it becomes an almost profitless operation. On top of that, the current bitcoin value is tenfold what it was 3 months ago: how could mining have been profitable back then if it is not right now?

Now another part that seems suspicious to me is those two websites [1][2]. They offer what every person would ever want: a way to make a lot of money easily.
Both of them deliver their products months after purchase, and the second website especially is selling something that would potentially pay for itself in less than a month, after which huge profits would come in. How convenient, to sell something that yields huge profit and pays for itself so quickly; better to sell it than use it ourselves, right? The first site has sold out, and funnily enough they are selling the next batch for 75 bitcoin each... which they could just mine themselves faster than their delivery time, so what's their gain, really? Conveniently we have the second website, not sold out, selling something similar to the first, without any pictures of what the back of their miner looks like, which won't mention anywhere the power consumption of their product but does say it comes with a USB cord, plug and play! That sure doesn't sound fishy at all, since the ASIC counterpart is shown on the comparison sheet as using 600 W; surely a USB connector can output that kind of power, right?

Hopefully someone can shed some light on all this, for the better understanding of these less commonly asked but quite important questions for anyone who wants to jump on this... bandwagon. I'm legitimately trying to see things optimistically, but so far I only see a few early insiders trying to scam the entire world by projecting this half-legit currency onto us.

Note: sorry for my relatively poor English; I tried putting my thoughts into words as precisely as I could, but I couldn't do it as well as I wish.
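One of the points above (earning half the calculator's prediction in a single day) has a mundane explanation besides inflated specs: block finds are a random Poisson process, so day-to-day income swings widely around the expectation. A minimal sketch, assuming a hypothetical pool that expects 10 blocks per day:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for a Poisson(lam) variable -- here, block finds per day."""
    return sum(lam**i * math.exp(-lam) / math.factorial(i) for i in range(k + 1))

# Hypothetical pool expecting 10 blocks/day: the chance of finding 5 or
# fewer (i.e. earning half the expected amount or less that day) is real.
print(round(poisson_cdf(5, 10), 3))  # → 0.067, roughly a 1-in-15 day
```

Over many days this luck averages out, but a single day or two of mining tells you almost nothing; the calculator's figure is a long-run mean, not a guarantee.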
Will I make money mining? – An answer for Bitcoin newcomers
When people first start exploring Bitcoin, they usually catch a small "gold rush" fever over the concept of mining (I know exactly how it feels; I started that way ;) ). Here is a short guide to answer the eternal question: "will I make money with this?". First, let's talk about hardware [https://en.bitcoin.it/wiki/Mining_hardware_comparison] (click the link for a long and useful list). You will not make money mining bitcoins unless you have an ATI GPU, an FPGA, or an ASIC. That is the short answer. With a decent CPU you can mine Litecoin [http://litecoin.org/], which can be a small source of income, but we are here to talk about Bitcoin. To see whether you will make any money, you need to put a few pieces of information into a special calculator [http://tpbitcalc.appspot.com/]:
the cost of your hardware (the cost of buying an ASIC, GPUs, motherboards, power supplies, etc.)
Then there are two magic variables that will either make everything work perfectly or doom it all to failure:
*difficulty – filled in automatically by the calculator, but for mining over longer periods (more than a few weeks) you should be pessimistic. Multiply the value by 10 for forecasts beyond a few months, or by 100 for a year or two (it will go up soon).
*bitcoin price – also filled in automatically by the calculator. It can rise or fall in the future, affecting your bottom line. It will probably rise in the long run, but let's be pessimistic and lower it to US$10-20 to make sure we earn money no matter what happens.
Put your best guesses for these two variables into the mining calculator and see what comes out. You will get your earnings in BTC and dollars, as well as a summary of your costs, when you might run out of money, and what your net profit will be at the end of your investment period. Most likely, you will not be able to make money mining, and that is fine – mining has become a rather specialized process. If you want to invest your money in a new ASIC, you may be able to turn a considerable profit. A quick summary for the lazy: use this [http://tpbitcalc.appspot.com/] to check everything. ASICs are worth investing your money in; GPUs are not.
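The pessimism the guide recommends can be made concrete: expected daily output is inversely proportional to difficulty, so multiplying difficulty by 10 divides revenue by 10. A minimal sketch of the standard expected-yield formula (the hash rate and difficulty values below are illustrative, not current figures):

```python
def daily_btc(hashrate_hs, difficulty, block_reward=25.0):
    # Expected BTC/day: hashes tried per day divided by the
    # expected number of hashes per block at this difficulty.
    hashes_per_block = difficulty * 2**32
    return hashrate_hs * 86400 / hashes_per_block * block_reward

base = daily_btc(400e6, 3_000_000)          # a 400 MH/s GPU, illustrative difficulty
pessimistic = daily_btc(400e6, 30_000_000)  # same rig, difficulty x10
print(round(base / pessimistic))  # → 10: revenue falls linearly with difficulty
```

This is why the "multiply difficulty by 10" advice matters so much: a rig that looks profitable today can be an electric heater a few adjustment periods later.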
Modest < $1,500 Gamer, Scrypt mining on the side, with dual/triple monitor capability.
What will you be doing with this PC? Be as specific as possible.
Modest gaming - no more than 1080p - BF, CSS, ARMA/DayZ, at as high a quality setting as possible before framerates dip below 40.
Future proof. This is the last computer I want for 5-10 years, but I'm not opposed to upgrading individual parts.
triple-monitor capability as a desire but not a requirement
Bitcoin/Scryptcoin (Litecoin/Dogecoin (very moon)) mining and protein folding in the meantime. I'm not looking to make a large profit by any means, so don't include this in your build thoughts if you'd rather not worry about it.
HD video editing.
Matlab and CAD (Solidworks & Electromagnetic modeling)
What is your maximum budget before rebates/shipping/taxes?
As cheap as possible. I'll go over no more than $2000 if need be for the triple-monitor capability.
When do you plan on building/buying the PC?
What, exactly, do you need included in the budget? OS, peripherals, wifi, in addition to the tower.
Which country will you be purchasing the parts in? If you're in US, do you have a Microcenter?
St. Louis, MO, US. Yes, there is a Microcenter. Is that a good thing? I've never been.
If reusing any parts (including peripherals), what parts will you be reusing? Brands and models are appreciated.
No. I've been suffering with an Asus G60 laptop for most of my college life.
Will you be overclocking? If yes, are you interested in overclocking right away, or down the line? CPU and/or GPU?
Down the line. I'd like to see more about benefits of OCing.
If there's any specific features you want/need from the rig, please list them.
Good cooling is a must for high GPU intensive processes (mining, folding).
Watercooling is a plus if it provides better performance than air.
SSD = YES.
Do you have any specific case preferences such as a window or LEDs, or do you have a preference for low-noise components?
Don't care about having a snazzy case... I'd make one out of plywood, tbh.
USB and audio on front.
Do they make cases with carry handles? I move a lot.
Noise isn't a big deal.
Do you already have a legit and reusable/transferable OS key/license? If yes, what OS?
Yes a part of my college MSDN agreement. Win7/8.
I've read the logical increments guide, and it looks like the enthusiast level is right for me. I'd like to get the price under $1,500 though. I want a PC that is at the knee of the price vs. performance curve.
Noob starting point - get started mining (with GPUs)
With the recent influx of essentially identical posts asking the same things over and over, I thought I'd try to help, though I'm just a novice.

For starters, if you want to mine on your PC you'll ideally want higher-end ATI/AMD video cards. Nvidia cards do work but are nowhere near as fast, and if you're concerned with cost of production you probably won't break even on Nvidia. The Bitcoin wiki mining hardware comparison chart is a great place to get an idea of what your video card will be capable of. Electrical costs should be kept in mind as well; I'd highly recommend a Kill-a-Watt for measuring power consumption. It should be pretty safe to say by now that you shouldn't buy video cards explicitly for mining at this time if you have any intention of recouping your cost. Hopefully, if you're reading this, you already know about ASICs and the impact they'll have once they are in wider production.

Setting up your card(s) for mining: power and cooling are both important here. I've recommended MSI Afterburner repeatedly because of its capabilities; don't bother with CCC. Afterburner works with any card vendor. With it you can control clock speeds, memory speeds, and voltage if the card supports it. Voltage is important if you're going for higher clock speeds, but use caution when tweaking voltage and clocks: adjust the clock speed in increments and start mining, and if your video driver crashes within a short amount of time, add a touch of voltage. Some cards just can't go very far over their stock speeds. Most people drop the memory speeds, which helps reduce power consumption and can increase stability. I've read that if the difference between GPU speed and memory speed is too great, that can cause issues as well. Again, it can't be stated enough: fine-tune your settings in small steps to find out what your card can do. Afterburner will also give you more control over your fans than CCC, allowing you to keep your card(s) cool.
You can mix and match cards from different families if you have them, as long as you don't run Crossfire while mining. Beyond that, it comes down to your favorite pool, your miner, and the settings for that miner. I'd recommend GUIMiner, and maybe even Slush's Pool, to keep it simple, and go from there.
NVidia GeForce GPU cards are surprisingly slow in comparison to AMD ATI Radeon HD; it's pointless to run bitcoin mining on NVidia cards. The best performance/cost ratio comes from ATI/AMD Radeon HD 5xxx series cards:
Radeon HD 5750 ~140 MHash/s - 90W
Radeon HD 5770 ~190 MHash/s - 110W
Radeon HD 5830 ~260 MHash/s - 175W
Radeon HD 5850 ~290 MHash/s - 151W
LITECOIN (LTC) hardware mining review comparison charts - performance of the AMD RADEON R9 series (R9 270X vs R9 280X vs R9 290 vs R9 290X) measured and compared. You can mine LITECOINs (LTC) with ... DOGECOIN hardware mining review comparison charts - performance of the AMD RADEON R9 series (R9 270X vs R9 280X vs R9 290 vs R9 290X) measured and compared. You can mine DOGECOINs with your graphics ... Parts used in this video: The GPUs: https://geni.us/a1ij2Vx Corsair 450W ATX PSU: https://geni.us/TodEZD The Best Mining Motherboard: https://geni.us/tAHmm I...