
Wolverine-Hawkeye Telecine


  • Hi. If possible, test capturing to Y8 or Y800 format and then debayering with this command-line program: https://www.fastcompression.com/products/debayer.htm

    When I use it I get better picture quality than the built-in debayer in The Imaging Source's IC Capture program.
    My camera is a The Imaging Source DFK 23G274, and the lens I use is a Rodenstock APO-Rodagon D 75mm f/4.0. It's the sharpest and best lens I have found, but it's too big for the Wolverine :/ and it's only good at 1x magnification. If the camera sensor is about Super 8 frame size then it works well; my camera's sensor is 1/1.8 inch.

    here is a test film capture https://www.flickr.com/photos/94271811@N03/11013222224/

    Here is the .bat file I run to debayer my image sequence:

    Code:
    REM Debayer an image sequence: TIFF (Y8) -> PGM -> debayered PPM
    SET Input_PATCH=D:\film\Y8
    SET OUTPUT_PATCH=F:\film\Debayer

    SET start=0
    SET end=32466

    REM %%i is passed without quotes, so %1 below is the bare frame number
    FOR /L %%i IN (%start%,1,%end%) DO (CALL :loopbody %%i)
    GOTO :eof

    :loopbody
    REM ImageMagick: convert the Y8 .tiff to .pgm (fast_debayer.exe only accepts .pgm)
    convert.exe "%Input_PATCH%\image%1.tiff" "D:\Temp\image%1.pgm"
    REM Debayer to .ppm with the MG algorithm and RGGB Bayer pattern
    fast_debayer.exe -i "D:\Temp\image%1.pgm" -type MG -pattern RGGB -d 0 -o "%OUTPUT_PATCH%\image%1.ppm"
    DEL /Q "D:\Temp\*.*"
    GOTO :eof
    First it converts the Y8 .tiff to a Y8 .pgm with ImageMagick, because fast_debayer.exe only accepts .pgm input, and saves the .pgm temporarily to D:\Temp\.
    Then fast_debayer.exe debayers it to .ppm and saves it to OUTPUT_PATCH (F:\film\Debayer).
    Then it deletes the file inside D:\Temp\ and the loop starts over.

    Input_PATCH (D:\film\Y8) holds the .tiff Y8 image sequence: image1.tiff, image2.tiff, and so on up to image32466.tiff.
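    The same per-frame pipeline can also be sketched in Python (a hypothetical stand-in for the .bat file, not a replacement Mattias uses; it assumes convert.exe and fast_debayer.exe are on the PATH and uses the paths from the batch script):

```python
import subprocess  # only needed if you actually run the commands

def debayer_commands(input_dir, temp_dir, output_dir, start, end):
    """Yield one (convert_cmd, debayer_cmd) pair per frame, mirroring the
    .bat loop above: TIFF (Y8) -> PGM -> debayered PPM."""
    for i in range(start, end + 1):
        tiff = f"{input_dir}\\image{i}.tiff"
        pgm = f"{temp_dir}\\image{i}.pgm"
        ppm = f"{output_dir}\\image{i}.ppm"
        yield (["convert.exe", tiff, pgm],
               ["fast_debayer.exe", "-i", pgm, "-type", "MG",
                "-pattern", "RGGB", "-d", "0", "-o", ppm])

# To actually run it (requires both tools on the PATH):
# for conv, deb in debayer_commands(r"D:\film\Y8", r"D:\Temp", r"F:\film\Debayer", 0, 32466):
#     subprocess.run(conv, check=True)
#     subprocess.run(deb, check=True)
```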
    Last edited by Mattias Norberg; July 03, 2020, 12:15 PM.



    • Thank you for sharing this, Mattias. I will try it with my setup with the UX226 camera. Your camera and lens may work with the Wolverine: you would have to remove the controller and install the Hawkeye mod, then attach the camera about 300 mm above the film, which would clear the Wolverine front panel, then rig up a mounting bracket for the camera.
      The image that you provided looks very good; it would be interesting to see the comparison.



      • I always do two captures of each film, at low and high exposure, and then merge them with AviSynth to get HDR. I also use AviSynth for the denoising; for color correction I use DaVinci Resolve.
        Maybe HDR capture would be something good for the Wolverine too, i.e. capturing low and high exposure on difficult films.

        Here is my latest HDR film capture: https://www.youtube.com/watch?v=7eZRYu56Opg
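        The actual merge (the script appears later in this thread) uses the HDRAGC AviSynth plugin; purely to illustrate the idea of combining two exposures, here is a minimal grayscale blend in Python (a toy sketch, not Mattias's method):

```python
def fuse_exposures(low, high, max_val=255):
    """Toy exposure fusion on grayscale pixel lists: weight the high
    exposure by how far it is from clipping, and fill clipped highlights
    from the low exposure instead."""
    fused = []
    for lo, hi in zip(low, high):
        w = 1.0 - hi / max_val      # fully clipped (hi == max_val) -> weight 0
        fused.append(round(w * hi + (1.0 - w) * lo))
    return fused
```

        A pixel that is blown out in the high exposure (255) gets replaced entirely by the low-exposure value, which is where the shadow-friendly high exposure has lost all detail.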



        • This is how my machine runs. In case others are different.

          A timelapse capture of the Hawkeye takeup speed.

          Hawkeye is running at 2 fps, using the standard Wolverine takeup reel. The timelapse is captured at 1 frame per minute; at 2 fps the Hawkeye advances 120 film frames per minute, so each timelapse frame spans 120 film frames.

          https://drive.google.com/file/d/1HWh...ew?usp=sharing

          The takeup reel tightens up around 15 minutes; I could have re-threaded the pulleys then.

          At 1h:20m a magnet appears that I use as a brake on the supply reel. It starts to spin and spill film.

          All the jpegs are here if it is difficult to start stop the video.

          https://drive.google.com/drive/folde...7z?usp=sharing



          Thanks for sharing Mattias, that was very impressive. The original cinematography was excellent, the transfer is equally impressive. I had wondered if the Y8 codec was usable.
          Last edited by David Brown; July 03, 2020, 08:35 PM.



          • Will test it on mine again, but when I ran it a while ago it behaved very similarly to yours, David.



            • Originally posted by David Brown View Post
              Thanks for sharing Mattias, that was very impressive. The original cinematography was excellent, the transfer is equally impressive. I had wondered if the Y8 codec was usable.
              Thanks. The GPU debayer can convert Y16 to 48-bit RGB, but I did not see any difference, except that I got a bigger file. Maybe I need a better monitor to see the difference, or maybe my The Imaging Source DFK 23G274 is too old.

              FireCapture also has a good debayer algorithm (http://www.firecapture.de), but GPU Debayer is better. It has been a long time since I tested FireCapture, though.
              Last edited by Mattias Norberg; July 04, 2020, 04:42 AM.



              • Hi Mattias, I see the same issue when I go to 64-bit (12 bits per channel). The issue is that even at 12 bits the details are not there. I take a cloudy sky that is all white, select only the sky in Photoshop (16-bit mode), and expand the histogram, and there is nothing. Will have to check this again.
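                Stan's histogram test can be put in numbers: if every pixel in the selected region sits at (or within a hair of) full white, stretching the histogram cannot reveal any detail. A toy check in Python (hypothetical helper; 12-bit values would run 0..4095):

```python
def highlight_detail(pixels, tolerance=1):
    """Report whether a selected region holds recoverable detail: if the
    value spread is within `tolerance`, expanding the histogram shows
    nothing. Works for any bit depth (e.g. 0..4095 for 12-bit)."""
    lo, hi = min(pixels), max(pixels)
    return {"min": lo, "max": hi, "spread": hi - lo,
            "clipped": hi - lo <= tolerance}
```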



                • Originally posted by Stan Jelavic View Post
                  Hi Mattias, I see the same issue when I go to 64bit (12 bits per channel). The issue is that even at 12 bits the details are not there. I take the cloudy sky that is all white and in photoshop (16bit mode) I select the sky only and expand the histogram and there is nothing. Will have to check this again.
                  OK, maybe in the future when we have better monitors.

                  But debayer algorithms do have some quality differences; here are some examples:

                  https://www.fastcompression.com/prod...ayer-moire.htm
                  https://www.apertus.org/what-is-deba...e-october-2015
                  http://blog.cuvilib.com/2014/06/12/dfpd-debayer-on-gpu/
                  https://www.samys.com/images/pdf/Debayering_API.pdf
                  Last edited by Mattias Norberg; July 04, 2020, 11:58 AM.



                  • Yes, I had another look at the HDR issue. The UX226 should be capable of producing the wide dynamic range, but the problem starts right at the beginning, during fine tuning of the system. I use a laptop with limited display capabilities and adjust the exposure reference for nice midtones and a vivid picture, but that clips the high end even with the 12 bits available. My guess is that with the HD monitor the exposure would be shorter and the high end would not be clipped, but if you looked at the resulting image on a regular monitor it would look washed out. That is my hunch anyway, but I could be wrong.

                    On another front, I spent some time over the weekend working on the capstan. What I have now is a smooth pulley in the 3rd position connected to the motor, with the motor's white cable disconnected (no drive to the motor, but the encoder is running).
                    The capstan operation runs only in slow-speed mode (turbo can still be on for 1 FPS).
                    The takeup switch also has to be on to enable capstan operation.
                    The takeup torque is higher for the first 1800 frames:
                    https://photos.app.goo.gl/HBfVgFvXP1MycB989

                    Then switches to lower torque for the rest of the reel:
                    https://photos.app.goo.gl/oqbfjPL4a5xM68yh9

                    This way the capstan is always engaged:
                    https://photos.app.goo.gl/wgfUcgtjSNMgYfZKA

                    If there is a jam, the stepper turns off and the sonalert blasts out two beeps.
                    https://photos.app.goo.gl/UeW5gFiDHU8bcm1CA

                    The scanner stays in this off mode until it is attended; there are no more beeps from the sonalert (that is the way it was designed).
                    The procedure after that is to turn off the run switch first and then turn off the takeup switch.
                    If the takeup is turned off first, the off mode will get locked and a power restart will be required.

                    One more note: the takeup switch controls the capstan operation but does not affect the low/high torque operation.
                    I wanted to add that but ran out of MSP430 cycles.
                    Will do another check of the frame sync, camera sync and takeup timing tomorrow.
                    Something to do while waiting for the lens, I guess.
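                    The jam-shutdown interlock described above can be modeled as a tiny state machine (a hypothetical Python model for illustration; the real logic lives in the MSP430 firmware and these names are invented):

```python
class CapstanController:
    """Toy model of the described shutdown procedure: after a jam, the run
    switch must be turned off before the takeup switch, or the off mode
    locks and a power restart is required."""

    def __init__(self):
        self.run_on = True
        self.takeup_on = True
        self.jammed = False
        self.locked = False

    def jam(self):
        # stepper off, two beeps, then silent until attended
        self.jammed = True

    def switch_off(self, which):
        if which == "takeup" and self.jammed and self.run_on:
            self.locked = True          # wrong order: off mode locks
        if which == "run":
            self.run_on = False
        elif which == "takeup":
            self.takeup_on = False
```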







                    • I don't know if this is any good, but here is the AviSynth script I use to get perfectly aligned sprocket holes on my films. It's a slow script, but it works very well.

                      Code:
                      SetMemoryMax(1920)
                      SetMTMode(2,2)
                      
                      
                      
                      film=ImageSource("D:\Super8MOVIE\image%d.tiff",start=1,end=100,fps=18).ConvertToYV12()
                      
                      
                      Return sprocketAlign(film,8,100)   # film= film clip ,8= still frame number ,100= loop the same frame 100 times
                      
                      
                      
                      
                      
                      #############################...Functions...#####################################
                      
                      function sprocketAlign(clip c1,int frame,int time_frames)
                      {
                      c2=c1.Trim(frame,frame).loop(time_frames)
                      c = Interleave(c2, c1)
                      
                      r1=overlay(c2,c2,x=0,y=0,mask=c2,opacity=1.0,greymask=true,mode="HardLight",pc_range=true).greyscale().NonlinUSM(1.2,2.6,6.0,8.5).invert()
                      a_ref=overlay(c2,r1,x=0,y=0,mask=c2,opacity=1.0,greymask=true,mode="Exclusion",pc_range=true).NonlinUSM(3.0,3.0,7.0,12.5).HighlightLimiter(1,true,1,true,100).invert().GaussianBlur(VarY=1).MT_binarize(threshold=1).greyscale().invert().crop(4,30,-1500,-30)
                      
                      
                      
                      r2=overlay(c1,c1,x=0,y=0,mask=c1,opacity=1.0,greymask=true,mode="HardLight",pc_range=true).greyscale().NonlinUSM(1.2,2.6,6.0,8.5).invert()
                      b_ref=overlay(c1.flick(),r2,x=0,y=0,mask=c1,opacity=1.0,greymask=true,mode="Exclusion",pc_range=true).NonlinUSM(3.0,3.0,7.0,12.5).HighlightLimiter(1,true,1,true,100).invert().GaussianBlur(VarY=1).MT_binarize(threshold=1).greyscale().invert().crop(4,30,-1500,-30)
                      
                      
                      c_ref = Interleave(a_ref, b_ref)
                      # calculate stabilization data
                      mdata = DePanEstimate(c_ref,trust=0.01,dxmax=0,dymax=150)
                      # stabilize
                      c_stab = DePanInterleave(c, data=mdata)
                      
                      b_stab = c_stab.SelectEvery(6, 2)
                      #return StackHorizontal(b_ref,a_ref) # use this to fix the Crop
                      return b_stab
                      }
                      
                      
                      
                      
                      
                      
                      
                      
                      
                      
                      
                      
                      function flick(clip e)
                      {
                      o = e
                      sm = o.bicubicresize(88,64).grayscale() # can be altered, but ~25% of original resolution seems reasonable
                      smm = sm.temporalsoften(1,32,255,24,2).merge(sm,0.25)
                      smm = smm.temporalsoften(2,12,255,20,2)
                      o2 = o.mt_makediff(mt_makediff(sm,smm,U=3,V=3).bicubicresize(width(o),height(o),0,0),U=3,V=3)
                      return o2
                      }
                      
                      
                      
                      function NonlinUSM(clip o, float "z", float "pow", float "str", float "rad", float "ldmp")
                      {
                      z = default(z, 6.0) # zero point
                      pow = default(pow, 1.6) # power
                      str = default(str, 1.0) # strength
                      rad = default(rad, 9.0) # radius for "gauss"
                      ldmp= default(ldmp, 0.001) # damping for verysmall differences
                      
                      g = o.bicubicresize(round(o.width()/rad/4)*4,round(o.height()/rad/4)*4).bicubicresize(o.width(),o.height(),1,0)
                      
                      mt_lutxy(o,g,"x x y - abs "+string(z)+" / 1 "+string(pow)+" / ^ "+string(z)+" * "+string(str)+
                      \ " * x y - 2 ^ x y - 2 ^ "+string(ldmp)+" + / * x y - x y - abs 0.001 + / * +",U=2,V=2)
                      
                      return(last)
                      }
                      
                      
                      
                      function HighlightLimiter(clip v, float "gblur", bool "gradient", int "threshold", bool "twopass", int "amount", bool "softlimit", int "method")
                      {
                      gradient = default (gradient,true) #True uses the gaussian blur to such an extent so as to create an effect similar to a gradient mask being applied to every area that exceeds our threshold.
                      gblur = (gradient==true) ? default (gblur,100) : default (gblur,5.0) #The strength of the gaussian blur to apply.
                      threshold = default (threshold,150) #The lower the value, the more sensitive the filter will be.
                      twopass = default (twopass,false) #Two passes means the area in question gets darkened twice.
                      amount = default (amount,10) #The amount of brightness to be reduced, only applied to method=2
                      softlimit = default (softlimit,false) #If softlimit is true, then the values around the edges where the pixel value differences occur, will be averaged.
                      method = default (method, 1) #Method 1 is multiply, the classic HDR-way. Any other method set triggers a brightness/gamma approach.
                      
                      amount = (amount>0) ? -amount : amount
                      
                      darken=v.Tweak(sat=0).mt_lut("x "+string(threshold)+" < 0 x ?")
                      blurred= (gradient==true) ? darken.gaussianblur(gblur).gaussianblur(gblur+100).gaussianblur(gblur+200) : darken.gaussianblur(gblur)
                      fuzziness_mask=blurred.mt_edge(mode="prewitt", Y=3, U=2, V=2).mt_expand(mode="both", Y=3, U=2, V=2)
                      multiply = (method==1) ? mt_lut(v,"x x * 255 /") : v.Tweak(bright=amount)
                      multiply = (method==1) ? eval("""(twopass==true) ? mt_lutxy(multiply,v,"x y * 255 /") : multiply""") : eval("""(twopass==true) ? multiply.SmoothLevels(gamma=0.9,smode=2) : multiply""")
                      
                      merged=mt_merge(v,multiply,blurred)
                      fuzzy= (softlimit==true) ? mt_merge(merged,mt_lutxy(v,merged,"x y + 2 /"),fuzziness_mask) : merged
                      return fuzzy
                      }
                      Inside the sprocketAlign function:

                      Edit the two crop(4,30,-1500,-30) calls and use "return StackHorizontal(b_ref,a_ref)" so you see only the sprocket hole of the film. If your film resolution is the same as mine (1600x1200), then I think there is no need to edit the crop.

                      dymax=150 is the vertical adjustment in pixels; I only use the vertical adjustment.
                      dxmax=0 is the horizontal adjustment; I keep it at 0 so no horizontal correction is applied.

                      Here is a screenshot of how StackHorizontal(b_ref,a_ref) looks when it's good: [SprocketALIGN.jpg]
                      In the picture, the left frame aligns to the right one, which is stationary and loops throughout the film.

                      In "Return sprocketAlign(film,8,100)": film = the film clip, 8 = the still (reference) frame number, 100 = how many times that frame is looped. Set the 100 to the total number of frames in the film.

                      The number 8 is the frame I have chosen here; all the frames in the film are aligned to frame 8.

                      There were some errors in the code, but I have fixed them now.
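                      DePanEstimate(dymax=150, dxmax=0) searches for the vertical offset that best matches each frame's sprocket-hole image to the looped reference frame. As a rough illustration of that idea only (not the plugin's actual algorithm), a brute-force 1-D vertical search in Python:

```python
def best_vertical_shift(ref_col, col, max_shift=150):
    """Find dy in [-max_shift, max_shift] minimizing the mean absolute
    difference between a reference column profile and a frame's profile.
    Toy stand-in for the vertical-only motion estimate."""
    n = len(ref_col)
    best_dy, best_cost = 0, float("inf")
    for dy in range(-max_shift, max_shift + 1):
        cost, count = 0, 0
        for y in range(n):
            if 0 <= y + dy < n:          # compare only overlapping rows
                cost += abs(ref_col[y] - col[y + dy])
                count += 1
        if count and cost / count < best_cost:
            best_cost, best_dy = cost / count, dy
    return best_dy
```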
                      Last edited by Mattias Norberg; July 06, 2020, 10:01 AM.



                      • I'll put my HDR AviSynth script here too.
                        I use a 6 W LED light.
                        The low-exposure clip was captured at 1/1070 s in IC Capture.
                        The high-exposure clip was captured at 1/66 s in IC Capture.

                        Code:
                        a=ImageSource("F:\Super8mm_Sound\N\image%d.ppm",start=1,end=14986,fps=18).ConvertToYV12() # Low exposure Clip
                        b=ImageSource("F:\Super8mm_Sound\E\image%d.ppm",start=1,end=14986,fps=18).ConvertToYV12() # High exposure Clip
                        
                        
                        
                        #return HDR1(a,b)
                        
                        return overlay(HDR1(a,b),HDR2(a,b),x=0,y=0,mask=HDR1(a,b),opacity=1.0,greymask=true,mode="Blend",pc_range=true)
                        
                        
                        
                        
                        
                        
                        
                        
                        
                        
                        
                        ####################...Functions...#######################
                        
                        
                        
                        function HDR1(clip a_stab,clip b_stabb)
                        {
                        b_stab=b_stabb.TurnLeft().HDRAGC(max_gain=5.0,min_gain=0.5,coef_gain=2.0,coef_sat=1.25,MODE=2,shadows=true,protect=1,reducer=0,corrector=0.0).TurnRight().ColorYUV(off_u=-12,gain_u=0)
                        
                        t=a_stab.TurnLeft().HDRAGC(max_gain=3.0,min_gain=0.5,coef_gain=2.0,coef_sat=2.00,MODE=2,shadows=true,protect=1,reducer=0,corrector=0.0).TurnRight()
                        
                        ab=overlay(b_stab,t,x=0,y=0,mask=b_stab,opacity=0.4,greymask=true,mode="Multiply",pc_range=true).ColorYUV(off_y=0,gain_y=30)
                        abc=overlay(b_stab,t,x=0,y=0,mask=b_stab,opacity=1.0,greymask=true,mode="Difference",pc_range=true)
                        ass=overlay(ab,abc,x=0,y=0,mask=invert(ab),opacity=1.0,greymask=true,mode="Blend",pc_range=true).ColorYUV(off_y=0,gain_y=0)
                        
                        ab1=overlay(t,ass,x=0,y=0,mask=t,opacity=0.15,greymask=true,mode="SoftLight",pc_range=true)
                        
                        l=overlay(ass,ab1,x=0,y=0,mask=ass,opacity=1.0,greymask=true,mode="Blend",pc_range=true).ColorYUV(off_y=0,gain_y=3)
                        ll=overlay(ab1,ass,x=0,y=0,mask=ab1,opacity=1.0,greymask=true,mode="Darken",pc_range=true).ColorYUV(off_y=-19,gain_y=22)
                        k=overlay(l,ll,x=0,y=0,mask=l,opacity=1.0,greymask=true,mode="Blend",pc_range=true)
                        
                        return k
                        }
                        
                        
                        
                        
                        function HDR2(clip a_stab,clip b_stab)
                        {
                        t=a_stab.coloryuv(autowhite=false).TurnLeft().HDRAGC(max_gain=5.0,min_gain=0.5,coef_gain=2.0,coef_sat=2.00,MODE=2,shadows=true,protect=1,reducer=0,corrector=0.0).TurnRight()
                        ab=overlay(b_stab,t,x=0,y=0,mask=b_stab,opacity=0.5,greymask=true,mode="Multiply",pc_range=true).ColorYUV(off_y=0,gain_y=32)
                        
                        ab1=overlay(t,b_stab,x=0,y=0,mask=t,opacity=0.5,greymask=true,mode="hardlight",pc_range=true)
                        l=overlay(ab,ab1,x=0,y=0,mask=ab,opacity=1.0,greymask=true,mode="blend",pc_range=true)
                        last=overlay(l,t,x=0,y=0,mask=ab,opacity=1.0,greymask=true,mode="blend",pc_range=true).TurnLeft().HDRAGC(max_gain=1.5,min_gain=0.1,coef_gain=2.0,coef_sat=0.90,MODE=1,shadows=true,protect=1,corrector=0.45).TurnRight()
                        return last.ColorYUV(off_y=0,gain_y=22)
                        }

                        Download the HDRAGC plugin from here: https://forum.doom9.org/showthread.php?t=93571

                        Here are some examples:

                        Low exposure: [Low.jpg]

                        High exposure: [High.jpg]

                        HDR (Low + High): [HDR.jpg]

                        HDR + denoise + color correction in DaVinci Resolve: [Resolve.jpg]
                        Last edited by Mattias Norberg; July 06, 2020, 11:34 AM.



                        • I couldn't resist a bit of manual processing to see what was actually there (adjustments to gamma, brightness and contrast in shadows and highlights). Looks very good to me, but doing that for each frame would take a long while; an image processor with batch processing could work.



                          • Originally posted by Brian Fretwell View Post
                            I couldn't resist a nit of manual processing to see what was actually there. (Adjustments to gamma, brightness and contrast in shadows and highlights). Looks very good to me but doing that for each frame would take a long while an image processor with batch processing could work.
                            Looks good. Did you use Photoshop or AviSynth?

                            The HDR AviSynth script I made is only trial and error; I guess it works OK from scene to scene, but of course not perfectly.

                            My camera is not so good, so I have to do a two-exposure capture.
                            Last edited by Mattias Norberg; July 06, 2020, 11:29 AM.



                            • I used a very old version of Paint Shop Pro (v4.15) with masking for highlights and shadows. First I lowered the gamma in the highlights, then increased it with the mask reversed to do the shadows. As this flattened the range, I increased brightness and contrast in the shadows, reversed the mask back, and very slightly decreased the gamma again.

                              A bit long-winded, but that's what I have been doing with the many thousands of still photos I have scanned, building up the technique over the years. Despite being able to adjust the scanner settings, the brightness range of the originals is too great to just use the raw output.
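                              Brian's masked-gamma step can be sketched like this (a toy grayscale version for illustration, not Paint Shop Pro's actual processing; the mask argument is a hypothetical 0/1 highlight mask):

```python
def masked_gamma(pixels, gamma, mask):
    """Apply a gamma curve only where mask is 1 (e.g. a highlight mask),
    leaving all other pixels untouched. 8-bit grayscale as flat lists;
    gamma > 1 darkens midtones, gamma < 1 lifts them."""
    out = []
    for p, m in zip(pixels, mask):
        if m:
            out.append(round(255 * (p / 255) ** gamma))
        else:
            out.append(p)
    return out
```

                              Running it once on the highlight mask and once on its inverse (with a different gamma) mirrors the two passes Brian describes.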



                              • I'll put my denoise AviSynth script here too; it's based on videofred's script.

                                Code:
                                SetMemoryMax(1920)
                                SetMTMode(5,5)
                                
                                
                                a=ImageSource("F:\Super8mm_Sound\HDR2\image%d.tiff",start=0,end=3629,fps=18).ConvertToYV12()
                                
                                
                                
                                
                                return denoise(a)
                                
                                
                                
                                
                                ###############...Functions...#####################################
                                
                                
                                
                                
                                
                                
                                function denoise(clip movie)
                                {
                                block_size=16
                                block_over=block_size/2
                                USM_sharp_ness=28
                                USM_radi_us=3
                                USM_sharp_ness1 = USM_sharp_ness
                                USM_sharp_ness2 = USM_sharp_ness+(USM_sharp_ness/4)
                                USM_sharp_ness3 = USM_sharp_ness*4
                                USM_radi_us1 = USM_radi_us
                                USM_radi_us2 = USM_radi_us-1
                                USM_radi_us3 = USM_radi_us2-1
                                
                                cleaned=movie.RemoveGrain(2).unsharpmask(USM_sharp_ness1,USM_radi_us1,0).RemoveGrain(2).unsharpmask(USM_sharp_ness2,USM_radi_us2,0).RemoveGrain(2)
                                
                                vectors= MVAnalyseMulti(cleaned,refframes=6,pel=2,truemotion=true,blksize=block_size,blksizev=block_size,overlap=block_over,dct=0,idx=1,search=1,threads=4,prefetch=1)
                                
                                T=MVDegrainMulti(cleaned,vectors,thSAD=1100,thSADC=500,limit=15,SadMode=0,idx=2,plane=4,threads=4).unsharpmask(USM_sharp_ness3,USM_radi_us3,0)
                                b2=t.mgrain33().mgrain3().NonlinUSM(1.2,1.5,1.2,3.5)
                                b22=overlay(b2,movie,mask=b2.GaussianBlur(VarY=20),opacity=1.0,greymask=true,mode="blend",pc_range=true).unsharpmask(30,3,0).sharpen(0.2)
                                return b22
                                }
                                
                                
                                
                                function mgrain3(clip last)
                                {
                                blksize_size=4
                                overlap_size=blksize_size/2
                                dct=0
                                tmotion=false
                                super= last.MSuper(pel=2)
                                bv1 = MAnalyse(super, isb = true, delta=1,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv1 = MAnalyse(super, isb = false, delta=1,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                bv2 = MAnalyse(super, isb = true, delta=2,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv2 = MAnalyse(super, isb = false, delta=2,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                bv3 = MAnalyse(super, isb = true, delta=3,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv3 = MAnalyse(super, isb = false, delta=3,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                l=last.MDegrain3(super, bv1,fv1,bv2,fv2,bv3,fv3,plane=4,thSAD=650,thSADC=100,limit=15)
                                return l
                                }
                                
                                
                                
                                function mgrain33(clip last)
                                {
                                blksize_size=32
                                overlap_size=blksize_size/2
                                dct=0
                                tmotion=false
                                super= last.MSuper(pel=2)
                                bv1 = MAnalyse(super, isb = true, delta=1,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv1 = MAnalyse(super, isb = false, delta=1,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                bv2 = MAnalyse(super, isb = true, delta=2,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv2 = MAnalyse(super, isb = false, delta=2,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                bv3 = MAnalyse(super, isb = true, delta=3,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                fv3 = MAnalyse(super, isb = false, delta=3,truemotion=tmotion,blksize=blksize_size,overlap=overlap_size, dct=dct)
                                l=last.MDegrain3(super, bv1,fv1,bv2,fv2,bv3,fv3,plane=4,thSAD=250,thSADC=100,limit=15)
                                return l
                                }
                                
                                
                                
                                
                                
                                
                                
                                
                                
                                function NonlinUSM(clip o, float "z", float "pow", float "str", float "rad", float "ldmp")
                                {
                                z = default(z, 6.0) # zero point
                                pow = default(pow, 1.6) # power
                                str = default(str, 1.0) # strength
                                rad = default(rad, 9.0) # radius for "gauss"
                                ldmp= default(ldmp, 0.001) # damping for verysmall differences
                                
                                g = o.bicubicresize(round(o.width()/rad/4)*4,round(o.height()/rad/4)*4).bicubicresize(o.width(),o.height(),1,0)
                                
                                mt_lutxy(o,g,"x x y - abs "+string(z)+" / 1 "+string(pow)+" / ^ "+string(z)+" * "+string(str)+
                                \ " * x y - 2 ^ x y - 2 ^ "+string(ldmp)+" + / * x y - x y - abs 0.001 + / * +",U=2,V=2)
                                
                                return(last)
                                }
