Objective-C programming

  • Objective-C programming

    This article in the NYT describes making a good living from home, if you can program a game for the iPhone: http://www.nytimes.com/2009/04/05/fa...=iPhone&st=cse. I read that with interest, because I thought of a game many years ago, a variant on the Rubik's Cube. It's based on a different "crystallography" but has the same general idea. It was too complicated for me to machine a prototype in 1982, but it could work as a virtual puzzle on an LCD screen.

    That imaginary road to riches seems as distant as ever, though, since these iPhone games are programmed in Objective-C. Learning this could be really tough, especially for somebody like myself with no experience in object-oriented languages.

    What is your experience with this? Is the leap from simple languages like Fortran or Basic to object-oriented programming as difficult as I expect it to be?
    Allan Ostling

    Phoenix, Arizona

  • #2
    Originally posted by aostling
    What is your experience with this? Is the leap from simple languages like Fortran or Basic to object-oriented programming as difficult as I expect it to be?
    Hi Allan,
    I'm a developer who worked on the iPhone 1.0 and 2.0 software -- including creating the Objective-C APIs that you would be using.

    Fortran and Basic are procedural languages, and it is quite a step to move from them to an object-oriented language. But of all the languages to learn, Obj-C is basically just a step above C, with not too many fancy things added to the language (unlike C++ or Java).
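
    To make "a step above C" concrete, here is a minimal sketch of a complete Objective-C program (the Counter class and its names are invented for illustration). Everything outside the @interface/@implementation blocks is plain C, and on a Mac it should build with gcc -framework Foundation counter.m:

    Code:
        #include <stdio.h>
        #import <Foundation/Foundation.h>

        // A class declaration: the only non-C syntax is the @-keywords
        // and the bracketed message sends in main().
        @interface Counter : NSObject {
            int count;    // instance variable -- an ordinary C int
        }
        - (void)increment;
        - (int)count;
        @end

        @implementation Counter
        - (void)increment { count++; }
        - (int)count { return count; }
        @end

        int main(void) {
            // Foundation memory management: create and drain a pool.
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            Counter *c = [[Counter alloc] init];   // [receiver message] syntax
            [c increment];
            [c increment];
            printf("count = %d\n", [c count]);     // plain C I/O works unchanged

            [c release];
            [pool drain];
            return 0;
        }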

    Downloading the SDK is free --- as long as you have a Mac, it doesn't hurt to sign up for ADC, download the SDK, and start compiling the examples and running through tutorials to figure out how to do stuff.

    corbin

    (Beginner machinist, but advanced programmer -- I work on Cocoa at Apple)



    • #3
      Originally posted by corbin

      Downloading the SDK is free --- as long as you have a Mac, it doesn't hurt to sign up for ADC, download the SDK, and start compiling the examples and running through tutorials to figure out how to do stuff.
      corbin
      Corbin,

      Thanks, this is encouraging. I do have a Mac, so I'll have a look at this. I may even decide to buy an iPhone, to get some exposure to this environment.
      Allan Ostling

      Phoenix, Arizona



      • #4
        Originally posted by aostling
        What is your experience with this? Is the leap from simple languages like Fortran or Basic to object-oriented programming as difficult as I expect it to be?
        Yes. I found it difficult to grasp the idea that essentially everything is event-driven and asynchronous. Tasks ("methods") run in their own time slice, and when they finish they cough up the result. Your main loop essentially sits around waiting for stuff to become available and then sends it off to wherever it belongs.

        I would perhaps start with Visual Basic, as the code itself is mostly familiar, but it is a fully object-oriented language and really doesn't owe much to the BASIC of the old days. There are quite a few changes in keywords, and the syntax is a lot different.

        For instance, this code fragment resets some variables to a known state. But to make sure that they are really reset before execution continues, I issue a command for the interpreter to synchronize all pending events by polling everything for finished tasks. That's the "Application.DoEvents" method call at the end of the routine.

        Code:
        ' RESET -- handler for the Reset button: put the state variables
        ' and the UI back to known startup values.
            Private Sub Button4_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles ResetButton.Click
                halt = 1                      ' state flags
                flag = False
                UD1.Value = 1                 ' reset the up/down controls...
                ud1shadow = 1                 ' ...and the shadow copy kept in code
                ud2.Value = 1
                pending = 1
                exposed = 0
                totaltimeLed.Text = "00:00"   ' clear the elapsed-time display
                frames = 0
                closecom()                    ' close the COM connection
                RunButton.Enabled = True
                Application.DoEvents()        ' pump pending events so everything above takes effect
            End Sub
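
        For comparison with the thread's iPhone topic: Cocoa uses the same event-driven pattern, but the framework owns the main loop, so there is no DoEvents-style pump to call. Here is a minimal hedged sketch in Objective-C (the Resetter class and its names are invented); the run loop blocks until the timer event is ready, then delivers it to the handler:

        Code:
            #include <stdlib.h>
            #import <Foundation/Foundation.h>

            // Invented example: a target object whose "action" method the
            // run loop invokes when the timer event fires.
            @interface Resetter : NSObject
            - (void)fire:(NSTimer *)timer;
            @end

            @implementation Resetter
            - (void)fire:(NSTimer *)timer {
                NSLog(@"event delivered -- resetting state");
                // Just return (or exit); the run loop goes back to waiting
                // on its own, with no explicit event pump needed.
                exit(0);
            }
            @end

            int main(void) {
                NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                Resetter *r = [[Resetter alloc] init];

                // Schedule a one-shot event, then hand control to the run
                // loop, which blocks until something is ready to deliver.
                [NSTimer scheduledTimerWithTimeInterval:1.0
                                                 target:r
                                               selector:@selector(fire:)
                                               userInfo:nil
                                                repeats:NO];
                [[NSRunLoop currentRunLoop] run];

                [pool drain];   // not reached; exit(0) above ends the process
                return 0;
            }
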
        Free software for calculating bolt circles and similar: Click Here



        • #5
          Allan, Objective-C is a contemporary of C++ -- it's a blend of C and Smalltalk.
          Most of it is straight C, with embedded Smalltalk. If you're comfortable with C and C++, Objective-C is pretty straightforward:

          [anObject aMethod] in Objective-C is the same as anObject->aMethod() in C++.
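
          To make the comparison concrete, here is a minimal hedged sketch (the strings and names are invented); the main visual difference from C++ is that arguments are interleaved with the method name:

          Code:
              #import <Foundation/Foundation.h>

              int main(void) {
                  NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

                  // One-argument message send -- roughly
                  // greeting = hello->stringByAppendingString(...) in C++ terms.
                  NSString *hello    = @"hello";
                  NSString *greeting = [hello stringByAppendingString:@", world"];

                  // With multiple arguments, the method name is split
                  // around them:
                  NSRange r = [greeting rangeOfString:@"world"
                                              options:NSCaseInsensitiveSearch];
                  NSLog(@"%@ (found at index %lu)", greeting, (unsigned long)r.location);

                  [pool drain];
                  return 0;
              }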

          I looked into developing some machinist apps for the iPhone (speeds and feeds, decimal equivalents, tap and drill chart, press and interference fits table, ....) a while ago: you need to have a Mac, obviously, for the Objective-C environment, and while the SDK download itself is free (as corbin noted), you need to pay $99 for Apple's iPhone Developer Program to put apps on a phone or in the App Store.

          If you're interested, Apple and Stanford are posting their 10-week iPhone developer's course as online courseware:

          http://blog.wired.com/gadgets/2009/0...-stanford.html
          "Twenty years from now you will be more disappointed by the things that you didn't do than by the ones you did."



          • #6
            Robert,

            I am building a new computer and have been out of the loop for a while. I want your Intel-centric opinion on what is the best CPU out there for severe number crunching on stuff like CAD and Raytracing programs. The ray tracer I use in particular can make use of any number of cores or even distributed machines. I don't play games any more so I really don't care about that.

            In the past the Athlon FPU ran circles around the Pentium FPU at a rate of about 2 to 1 according to many benchmarks on the various distributed computing projects. Has that changed? If so, how?
            Free software for calculating bolt circles and similar: Click Here



            • #7
              Originally posted by Evan
              I want your Intel-centric opinion on what is the best CPU out there for severe number crunching on stuff like CAD and Raytracing programs. The ray tracer I use in particular can make use of any number of cores or even distributed machines.
              The best FPU out there for ray tracing is prob. the Nvidia GPUs w/ CUDA; this gives a pretty amazing number of compute engines for low $$$.

              http://www.nvidia.com/object/cuda_home.html#

              If you're looking at doing number crunching on a traditional single socket CPU, I'd use an Intel i7 for maximum performance at this point.

              http://arstechnica.com/hardware/news...ent-market.ars

              The i7s are very fast indeed.

              Oblig. disclaimer: I do kernel performance work at Sun Microsystems, where we build machines w/ AMD, Intel, and various SPARC CPUs.

              - Bart
              Bart Smaalders
              http://smaalders.net/barts



              • #8
                With the tools available today, programming in C++ is very much like programming in Visual Basic -- a lot of it is just drag and drop. It's not like it was when I started writing C code using the DOS edlin editor, and certainly easier than entering object code with paddle switches on the first computers I had access to. The very first computer I programmed belonged to Farmer's Insurance in Berkeley and used jumper wires and a screwdriver.



                • #9
                  Originally posted by Bart
                  The best FPU out there for ray tracing is prob. the Nvidia GPUs w/ CUDA; this gives a pretty amazing number of compute engines for low $$$.
                  I don't mean ray tracing in realtime for display purposes. I am talking about crunching an image for later display. Some of these images take hours to compute, some even days. The ray tracer is POV-Ray, and it supports true physical modeling of most optical phenomena. Some are enormously computationally intensive, such as atmospheric scattering in any of several models such as Rayleigh, Mie (murky), and Henyey-Greenstein.

                  Is it possible for the OS software to use the Nvidia FPU in that manner, similar to the Cell processor on the PlayStation?
                  Last edited by Evan; 04-05-2009, 05:58 PM.
                  Free software for calculating bolt circles and similar: Click Here



                  • #10
                    Originally posted by dp
                    It's not like it was when I started writing C code using the DOS edlin editor, and certainly easier than entering object code with paddle switches on the first computers I had access to.
                    hee-hee..... putting in the boot program with toggles...... THAT takes me back a ways...... like 4k of core......

                    I no longer recall how the Bendix G-15 booted, that was the first one I used.
                    1601

                    Keep eye on ball.
                    Hashim Khan



                    • #11
                      Hi Evan,

                      I'm probably a bit biased since I architected some of the multimedia (SSE4) instructions, but Nehalem, a.k.a. "Core i7", is a totally new microarchitecture and IMHO the best processor Intel has built since the P6 (Pentium Pro).

                      Nehalem's IPC (instructions per clock) is a big leap over Penryn (Core 2), and Penryn has a big advantage in IPC over Phenom. Each Nehalem core has a full-width (128-bit) floating-point unit, and the new SSE4 multimedia instructions exploit that hardware. A 3-channel integrated memory controller substantially reduces memory latency, and Nehalem replaces the front-side bus with the new point-to-point QPI interconnect (formerly "CSI").

                      Originally posted by Evan
                      The ray tracer I use in particular can make use of any number of cores or even distributed machines.
                      Nehalem is a quad-core, with each core having 2 hardware threads == 8 logical processors. So each Nehalem chip looks to the OS like an 8-processor SMP (symmetric multiprocessor). If your raytracer is multithreaded, and especially if it's optimized for SSE 2/3/4, it will be very, very happy on Nehalem.

                      Some random benchmarks from Tom's Hardware (Anandtech has similar results). Nehalem has a huge performance margin over its contemporaries:

                      http://www.tomshardware.com/charts/d...e-CPU,817.html
                      http://www.tomshardware.com/charts/d...e-MP3,828.html
                      http://www.tomshardware.com/charts/d...yback,834.html
                      http://www.tomshardware.com/charts/d...-CS-3,826.html

                      You can get a quad-core Nehalem for around $280 at Newegg et al:

                      http://www.newegg.com/Product/Produc...82E16819115202
                      Last edited by lazlo; 04-05-2009, 10:54 PM.
                      "Twenty years from now you will be more disappointed by the things that you didn't do than by the ones you did."



                      • #12
                        Thanks Robert. POV-Ray is currently at version 3.6, with 3.7 scheduled soon. 3.7 is fully multithreaded and so will take advantage of whatever is available.

                        I will look at the prices here on Tiger Direct and see what they have and if I can afford it. I'm on a budget but computer parts are really cheap right now. What's the next best in your opinion?
                        Free software for calculating bolt circles and similar: Click Here



                        • #13
                          I no longer recall how the Bendix G-15 booted, that was the first one I used.
                          It was the first machine I used as well. It booted, if you can call it that, with a hardware instruction decoder that allowed it to read the rotating drum memory and load up a very simple I/O routine for the teletype. There wasn't much to boot, as it came up with no software at all, just an octal "editor" with no editing capability for directly inputting CPU instructions, of which there were only about 20 or so. It had only one branch instruction: branch if zero.
                          Free software for calculating bolt circles and similar: Click Here



                          • #14
                            Never went the programmer route; I only had to do enough to troubleshoot. But the first CPU I worked on had heated core memory, and we entered the bootstrap with pushbuttons, which started a teletype punched-tape reader.



                            • #15
                              Originally posted by Evan
                              Is it possible for the OS software to use the Nvidia FPU in that manner, similar to the cell processor on the PlayStation?
                              Actually, the OS really doesn't get involved other than to set up device mappings, AFAIK...

                              The programming model for CUDA is ... tricky ... in terms of the synchronization needed for good performance; I'd guess that for POV-Ray you're better off programming the main CPU.

                              BTW, my son and I built a quad-core i7 (Nehalem) for my daughter this past Christmas; w/ 6 GB of triple-channel DDR3 it's very fast indeed.

                              - Bart
                              Bart Smaalders
                              http://smaalders.net/barts

