First, I'm speaking for myself here; I am just one volunteer of many who try to answer the questions posted. I am not paid in any way by IFIXIT or Apple, nor, in the case of Apple, do I have any affiliation with them other than being a user of their products and one who services them. Given the fact there is litigation pending, I don't think you will find the IFIXIT folks directly responding to your questions (nor any other company that deals with Apple or Apple products).
Personally, I think there is enough blame to pass around here: Apple, the GPU supplier, the EU demanding lead be removed from solder in all cases, users who fail to allow the system to breathe, and in some cases the applications running on the system.
The problem I have is knowing how many systems sold are affected (as a percentage). While I'm not disagreeing that people are having problems, can it all be attributed to something Apple did or should have known? And then the question is: when did they know, and what should the corrective action have been?
Remember, Apple was stung by a similar problem with the older MacBook Pros (e.g. the A1226) with the NVIDIA GeForce 8600M GT. The companies that had problems with the 8600M GT chip (Apple as well as others) went after NVIDIA, and Apple then swapped out the logic boards for its customers, with NVIDIA paying for the repairs (one could argue Apple didn't run the program soon enough, or long enough, before it EoL'd the affected systems). At that point in time the number of affected systems in the wild was a lot smaller than here.
Are we so sure Apple dropped the ball here yet again? Is it possible the GPU suppliers altered things after the initial builds of the system (as was the case with NVIDIA before)? Did Apple's hardware partners switch out the thermal paste for a cheaper, less effective one? And lastly, has anyone done the deep analysis here to understand what is really going on?
While using leaded solder (I would only do so on the BGA chips that get excessively hot) would help reduce the tin whiskers problem if that is the cause, a more basic one is simply crystallization of the solder, which changes it from a conductor to a semiconductor under excessive heat over a period of time (often called a cold solder joint).
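To put rough numbers on why heat matters so much for joint life, here is a back-of-the-envelope sketch. It assumes a simplified Coffin-Manson-style fatigue law where cycles to failure scale roughly with the inverse square of the temperature swing; the exponent and the baseline figures are illustrative assumptions on my part, not measured values for these boards:

```python
# Back-of-the-envelope solder joint fatigue estimate.
# Assumes a simplified Coffin-Manson-style law: cycles to failure
# scale as (1/dT)**2, where dT is the temperature swing per
# heat-up/cool-down cycle. The exponent and the baseline swing are
# illustrative assumptions, not measured values for these boards.

def relative_joint_life(delta_t: float, baseline_delta_t: float = 40.0,
                        exponent: float = 2.0) -> float:
    """Joint life relative to a baseline temperature swing (deg C)."""
    return (baseline_delta_t / delta_t) ** exponent

for dt in (40, 60, 80):  # idle-to-load swings in deg C
    print(f"dT = {dt} C -> {relative_joint_life(dt):.2f}x baseline life")
```

The exact numbers aren't the point; the point is that under this kind of model, doubling the temperature swing cuts joint life to roughly a quarter, which is why hot-running GPUs and clogged vents add up over a few years.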
Using better thermal paste and more aggressive heat dissipation (heat sink and fan system) might help. But that assumes the heat transfer to the heat sink fins was not working correctly in the current design. It's possible the GPU suppliers lowballed the thermal mass needed to keep the GPU from running too hot. Maybe the GPU suppliers should have done a better design or used different materials so the chip would run cooler.
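To see why the paste question matters, the standard junction-temperature budget is T_j = T_ambient + P × (θ_jc + θ_paste + θ_sink). A quick illustrative calculation follows; the wattage and resistance figures are round numbers I picked to show the effect, not specs for this GPU:

```python
# Junction temperature from the standard thermal-resistance stack:
#   T_j = T_ambient + P * (theta_jc + theta_paste + theta_sink)
# All figures are illustrative round numbers, not real GPU specs.

def junction_temp(power_w: float, theta_paste: float,
                  t_ambient: float = 35.0,  # inside-case air, deg C
                  theta_jc: float = 0.3,    # die-to-case resistance, C/W
                  theta_sink: float = 0.8   # sink-to-air resistance, C/W
                  ) -> float:
    return t_ambient + power_w * (theta_jc + theta_paste + theta_sink)

for label, theta in (("good paste", 0.1), ("cheap paste", 0.5)):
    print(f"{label}: {junction_temp(30.0, theta):.0f} C at 30 W")
```

A few tenths of a °C/W difference in paste quality turns into a double-digit temperature difference at sustained GPU wattage, which is exactly the kind of margin that decides whether solder joints survive years of cycling.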
Maybe the app developers should have used different coding methods so the GPU didn't need to work as hard. Remember, the way we use our computers has evolved; Apple could just be guilty of not projecting far enough ahead what we would expect the systems to do. As an example, the visual effects in many of today's games are just mind-blowing! And that is just within the last 3~4 years, which is about the point in time when Apple started the design of the '11 series models.
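As one concrete example of coding so the GPU doesn't need to work as hard: capping the frame rate instead of rendering flat-out. This is just a sketch of the idea; render_frame() is a hypothetical stand-in for an app's real draw call, and real engines would typically use vsync or their framework's built-in limiter:

```python
# Frame-rate cap sketch: sleep away the slack instead of rendering
# as fast as the GPU will go. render_frame() is a hypothetical
# placeholder for an app's actual draw call.
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame() -> None:
    time.sleep(0.005)  # stand-in for real GPU work

for _ in range(300):  # roughly 10 seconds at 30 fps
    start = time.monotonic()
    render_frame()
    slack = FRAME_BUDGET - (time.monotonic() - start)
    if slack > 0:
        time.sleep(slack)  # GPU idles here instead of drawing extra frames
```

An uncapped render loop will happily pin the GPU at 100% drawing frames the display never even shows; letting the chip idle between frames keeps it cooler for the same visual result.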
And lastly, the users who push the limits of the system may be expecting too much here, running their systems with the vents clogged by the sheets and blankets of their bed, for example, while playing today's most CPU/GPU-demanding game. I'm guilty of doing this myself. I can't blame Apple for my misuse.
I should point out I am very supportive of removing lead from our environment as much as possible, as it's clear it is a poison not just to us but to animals and fish as well. But swinging to the other extreme, and only within the tech sector, is I think a bit much. If we are serious here about lead, we need to do a better job addressing car batteries and other larger sources. Using leaded solder selectively, in small quantities, is not that large a risk considering the amount of lead in a CRT (about 1/3 of its weight!), which we no longer use.